Ollama makes it fairly easy to download open-source LLMs, but even small models can run painfully slowly; don't try this without a newer machine with 36GB of RAM. As a reporter covering artificial ...
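
To give a sense of how little code this takes, here is a minimal sketch using Ollama's Python client (`pip install ollama`). It assumes Ollama is already installed and running locally; the model tag `llama3.2` is just an example of a small model, not a recommendation.

```python
# Minimal sketch: pull a small open-source model and chat with it locally.
# Assumes the Ollama server is running and "llama3.2" is an available model tag.
import ollama

ollama.pull("llama3.2")  # downloads the model weights if they aren't already cached

response = ollama.chat(
    model="llama3.2",
    messages=[{"role": "user", "content": "In two sentences, what is a local LLM?"}],
)
print(response["message"]["content"])  # the reply, generated entirely on this machine
```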