

langsight investigate uses an LLM to analyse health evidence and produce root cause analysis (RCA) reports. You can choose from four providers: Claude (Anthropic), OpenAI, Gemini, or Ollama.

Quick setup

Add an investigate block to .langsight.yaml:
```yaml
investigate:
  provider: gemini           # anthropic | openai | gemini | ollama
  model: gemini-2.0-flash    # optional — sensible defaults per provider
```
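The config surface is small enough to validate up front. A minimal sketch of that validation, assuming the `investigate` block has been parsed into a plain dict (the function name and error wording are illustrative, not langsight's actual internals):

```python
# Allowed values for investigate.provider, per the docs.
PROVIDERS = {"anthropic", "openai", "gemini", "ollama"}

def validate_investigate(block: dict) -> dict:
    """Check an investigate: block; model is optional per provider."""
    provider = block.get("provider")
    if provider not in PROVIDERS:
        raise ValueError(
            f"provider must be one of {sorted(PROVIDERS)}, got {provider!r}"
        )
    # api_key was removed from this block in v0.7.0 (see below).
    if "api_key" in block:
        raise ValueError("api_key belongs in the environment, not the config file")
    # model=None means "use the provider's default".
    return {"provider": provider, "model": block.get("model")}

print(validate_investigate({"provider": "gemini", "model": "gemini-2.0-flash"}))
```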

Comparison

|                | Claude    | OpenAI | Gemini        | Ollama          |
|----------------|-----------|--------|---------------|-----------------|
| Free tier      | No        | No     | ✅ 1,500/day  | ✅ Unlimited    |
| Data privacy   | Anthropic | OpenAI | Google        | On your machine |
| Context window | 200K      | 128K   | 1M            | Varies          |
| RCA quality    | ⭐⭐⭐⭐⭐ | ⭐⭐⭐⭐ | ⭐⭐⭐⭐      | ⭐⭐⭐          |
| Setup time     | 2 min     | 2 min  | 2 min         | 5 min           |

Environment variables

| Provider | Variable            | Where to get it                |
|----------|---------------------|--------------------------------|
| Claude   | `ANTHROPIC_API_KEY` | console.anthropic.com          |
| OpenAI   | `OPENAI_API_KEY`    | platform.openai.com/api-keys   |
| Gemini   | `GEMINI_API_KEY`    | aistudio.google.com/app/apikey |
| Ollama   | (none required)     |                                |
LLM provider API keys must be set as environment variables (ANTHROPIC_API_KEY, OPENAI_API_KEY, GEMINI_API_KEY). Never put them in .langsight.yaml — the config file may be committed to version control. The api_key field has been removed from the investigate: config block as of v0.7.0.

Rule-based fallback

If no provider is configured or the API call fails, langsight investigate automatically falls back to deterministic heuristics — no LLM needed.
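The fallback behaviour described above is a standard try-then-degrade pattern. A sketch with a stand-in heuristic (the actual rules langsight applies are not documented here; this one just blames the component with the most error lines):

```python
def investigate(evidence: list[str], llm_call=None) -> str:
    """Prefer the configured LLM; fall back to deterministic heuristics."""
    if llm_call is not None:
        try:
            return llm_call(evidence)
        except Exception:
            pass  # provider unreachable or errored: fall through to rules
    # Hypothetical rule: each evidence line is "component: message";
    # flag the component that appears most often.
    counts: dict[str, int] = {}
    for line in evidence:
        component = line.split(":", 1)[0].strip()
        counts[component] = counts.get(component, 0) + 1
    worst = max(counts, key=counts.get)
    return f"rule-based: most errors point at {worst}"
```

The same call site works whether or not a provider is configured, which is why the fallback needs no extra flag.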

Claude

Best quality RCA with adaptive thinking.

OpenAI

GPT-4o for teams already on OpenAI.

Gemini

Free tier, 1M context window.

Ollama

Local, free, air-gapped.