Ollama

Run open-source LLMs locally with a simple CLI

Free and open-source. Run on your own hardware.

Visit Ollama

https://ollama.com

About Ollama

Open-source tool for running large language models locally on your machine. Ollama makes it easy to download, run, and manage LLMs like Llama, Mistral, and Gemma with a simple CLI and API.
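The download-run-manage workflow described above can be sketched with a few CLI commands. This assumes Ollama is already installed and its local server is running; the model tag used here is one example from the model library.

```shell
# Download a model from the Ollama library (one command):
ollama pull llama3.2

# Start an interactive chat session with the downloaded model:
ollama run llama3.2

# List models stored locally:
ollama list
```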

Key Features

One-command model download
Local inference
OpenAI-compatible API
Model library
GPU acceleration
Customizable models
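As a sketch of the OpenAI-compatible API feature: once the local server is running (by default on port 11434) and a model has been pulled, existing OpenAI-style clients can point at the local endpoint. The model name below is an example, not a requirement.

```shell
# Assumes the Ollama server is running locally and `llama3.2` has been pulled.
# The request body follows the OpenAI chat completions format:
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3.2",
    "messages": [{"role": "user", "content": "Say hello"}]
  }'
```

Because the request and response shapes match the OpenAI API, tools built against it can typically be redirected to Ollama by changing only the base URL.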

Tags

local ai, open-source, llm, self-hosted, cli, privacy
