Ollama: Easily run LLMs locally on macOS and Linux
ollama.ai
Ollama
Ollama is the easiest way to automate your work using open models, while keeping your data safe.

If you just want to run LLMs quickly on your computer from the command line, this is about as simple as it gets. Ollama provides an easy CLI for generating text, and there’s also a Raycast extension for more powerful usage.
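As a quick sketch of what that CLI workflow looks like (model names here are illustrative; check Ollama's model library for what's currently available):

```shell
# Download a model from the Ollama library
ollama pull llama2

# Start an interactive chat session with it
ollama run llama2

# Or pass a one-shot prompt non-interactively
ollama run llama2 "Summarize this in one sentence: Ollama runs LLMs locally."

# See which models you have downloaded
ollama list
```

Everything runs locally, so prompts and outputs never leave your machine.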