Ollama
Run open-source LLMs locally with a simple CLI
Visit Ollama
https://ollama.com
About Ollama
An open-source tool for running large language models locally. Ollama makes it easy to download, run, and manage models such as Llama, Mistral, and Gemma through a simple CLI and API.
Key Features
✓ One-command model download
✓ Local inference
✓ OpenAI-compatible API
✓ Model library
✓ GPU acceleration
✓ Customizable models
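To illustrate the OpenAI-compatible API mentioned above, here is a minimal sketch of building a chat-completion request against a local Ollama server. It assumes the default port 11434, that `ollama serve` is running, and that a model has been pulled first (e.g. `ollama pull llama3`); the helper name `build_chat_request` is for illustration only.

```python
import json

# Assumption: Ollama's OpenAI-compatible endpoint lives under /v1
# on the default local port 11434.
OLLAMA_BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model: str, prompt: str) -> tuple[str, bytes]:
    """Return the endpoint URL and JSON body for a chat completion call."""
    url = f"{OLLAMA_BASE_URL}/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return url, body

url, body = build_chat_request("llama3", "Why is the sky blue?")
print(url)
```

The same payload can be POSTed with any HTTP client (or an OpenAI SDK pointed at the local base URL), which is what makes existing OpenAI-based tooling reusable with locally served models.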
Tags
local ai, open-source, llm, self-hosted, cli, privacy