LLM Provider Configuration

Configure LLM providers for OSSA agents - Anthropic, OpenAI, Google, and more

OSSA supports multiple LLM providers through a unified configuration interface.

Supported Providers

| Provider | Models | Status |
| --- | --- | --- |
| Anthropic | Claude 3.5, Claude 3 Opus/Sonnet/Haiku | Stable |
| OpenAI | GPT-4o, GPT-4 Turbo, GPT-3.5 | Stable |
| Google | Gemini 2.0, Gemini 1.5 | Stable |
| Mistral | Mistral Large, Mixtral | Beta |
| Groq | LLaMA, Mixtral | Beta |

Configuration

```yaml
apiVersion: ossa/v0.4.9
kind: Agent
spec:
  llm:
    provider: ${LLM_PROVIDER:-anthropic}
    model: ${LLM_MODEL:-claude-sonnet-4-20250514}
    temperature: ${LLM_TEMPERATURE:-0.1}
    maxTokens: ${LLM_MAX_TOKENS:-16000}

    # Fallback configuration
    fallback_models:
      - provider: openai
        model: gpt-4o
        condition: on_error
      - provider: google
        model: gemini-2.0-flash
        condition: on_rate_limit

    # Retry configuration
    retry_config:
      max_attempts: 3
      backoff_strategy: exponential
      initial_delay_ms: 1000
```
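The `retry_config` and `fallback_models` settings above compose: each model is retried with exponential backoff, and only when its attempts are exhausted does the agent move to the next fallback. A minimal Python sketch of that behavior, assuming a generic `invoke(provider, model)` callable; `call_with_retry` and `call_with_fallback` are illustrative names, not OSSA APIs, and the per-fallback `condition` matching is omitted for brevity:

```python
import random
import time


def call_with_retry(call, max_attempts=3, backoff="exponential", initial_delay_ms=1000):
    """Retry a callable, mirroring retry_config: 3 attempts, exponential backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return call()
        except Exception:
            if attempt == max_attempts:
                raise  # out of attempts; surface the error to the fallback chain
            if backoff == "exponential":
                delay = (initial_delay_ms / 1000) * (2 ** (attempt - 1))  # 1s, 2s, 4s, ...
            else:
                delay = initial_delay_ms / 1000
            time.sleep(delay + random.uniform(0, 0.1))  # jitter avoids thundering herds


def call_with_fallback(models, invoke, **retry_kwargs):
    """Try each (provider, model) pair in order; primary first, fallbacks after."""
    last_error = None
    for provider, model in models:
        try:
            return call_with_retry(lambda p=provider, m=model: invoke(p, m), **retry_kwargs)
        except Exception as exc:
            last_error = exc  # this model is exhausted; move to the next fallback
    raise last_error
```

With the manifest above, the chain would be `[("anthropic", "claude-sonnet-4-20250514"), ("openai", "gpt-4o"), ("google", "gemini-2.0-flash")]`.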

Environment Variables

| Variable | Description | Default |
| --- | --- | --- |
| LLM_PROVIDER | Primary provider | anthropic |
| LLM_MODEL | Model name | claude-sonnet-4-20250514 |
| LLM_TEMPERATURE | Response randomness | 0.1 |
| LLM_MAX_TOKENS | Max output tokens | 16000 |
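The `${VAR:-default}` placeholders in the manifest follow shell-style default substitution: the environment value wins, otherwise the default after `:-` is used. A minimal Python sketch of that resolution; `resolve_placeholders` is a hypothetical helper, not part of OSSA:

```python
import os
import re

# Matches ${VAR:-default}; group 1 is the variable name, group 2 the default
_PLACEHOLDER = re.compile(r"\$\{(\w+):-([^}]*)\}")


def resolve_placeholders(value: str) -> str:
    """Replace ${VAR:-default} with the env value if set, else the default."""
    return _PLACEHOLDER.sub(lambda m: os.environ.get(m.group(1), m.group(2)), value)


# "anthropic" when LLM_PROVIDER is unset; the env value otherwise
resolve_placeholders("${LLM_PROVIDER:-anthropic}")
```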