## Why Ollama?
- Completely free — no API costs
- Privacy — no data sent to external services
- Air-gapped — works without internet access
- No API key needed
## Setup

- Install Ollama from ollama.com/download
- Pull a model:

  ```
  ollama pull llama3.2
  ```

- Verify Ollama is running; if it is not, start the server:

  ```
  ollama serve
  ```

- Configure the model in `.langsight.yaml`.

No `OLLAMA_API_KEY` is needed.
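One possible `.langsight.yaml` entry is sketched below; the key names (`provider`, `model`, `base_url`) are assumptions for illustration, since the actual schema is not shown here. The endpoint `http://localhost:11434` is Ollama's real default.

```yaml
# Hypothetical .langsight.yaml sketch; key names are assumed, not confirmed.
provider: ollama
model: llama3.2                    # any tag from the Models table works
base_url: http://localhost:11434   # Ollama's default local endpoint
```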
## Models

| Model | RAM | Quality | Speed |
|---|---|---|---|
| llama3.2 | 2 GB | Good | Fast |
| llama3.1:8b | 8 GB | Better | Medium |
| mistral | 8 GB | Good | Medium |
| qwen2.5:14b | 16 GB | Excellent | Slow |
## Remote Ollama

If Ollama runs on another machine, point the client at that machine's address instead of the default `http://localhost:11434`.

## Troubleshooting

If requests fail with a connection error, make sure the Ollama server is running:

```
ollama serve
```

If a model is reported as missing, pull it first:

```
ollama pull llama3.2
```
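For the remote case above, the server address usually comes from an environment variable; Ollama's own tooling reads `OLLAMA_HOST`, falling back to port 11434 on localhost. A minimal sketch of resolving the address (the helper function is illustrative, not part of langsight):

```python
import os

def ollama_base_url() -> str:
    """Resolve the Ollama endpoint from OLLAMA_HOST, defaulting to localhost.

    Note: this helper is an illustration; it is not a langsight API.
    """
    host = os.environ.get("OLLAMA_HOST", "").strip()
    if not host:
        return "http://localhost:11434"  # Ollama's default local endpoint
    if not host.startswith(("http://", "https://")):
        host = "http://" + host  # OLLAMA_HOST may be given as just host:port
    return host.rstrip("/")
```

Pointing a client at a remote machine is then just a matter of exporting, for example, `OLLAMA_HOST=gpu-box:11434` before running it.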