Models

Configure available LLM models and provider settings.

The Models section controls which LLM models are available in your workspace and how they're configured.

Provider configuration

Set up each LLM provider:

OpenRouter gives access to 100+ models from all major providers through a single API key.

  • API Key — set in Credentials

  • Base URL — https://openrouter.ai/api/v1 (default)

  • Default Model — the model used when agents don't specify one

Popular models via OpenRouter:

  • anthropic/claude-sonnet-4-20250514

  • google/gemini-2.5-pro

  • deepseek/deepseek-chat

  • openai/gpt-4
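Because OpenRouter exposes an OpenAI-compatible API, a chat completion request is an authenticated POST against the Base URL above. This is a minimal stdlib-only sketch; the `build_chat_request` helper and the `OPENROUTER_API_KEY` environment variable name are illustrative conventions, not part of this product's configuration.

```python
import json
import os
import urllib.request

OPENROUTER_BASE_URL = "https://openrouter.ai/api/v1"

def build_chat_request(model: str, messages: list[dict]) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request for OpenRouter."""
    body = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        f"{OPENROUTER_BASE_URL}/chat/completions",
        data=body,
        headers={
            # API key comes from Credentials; read from the environment here
            "Authorization": f"Bearer {os.environ.get('OPENROUTER_API_KEY', '')}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request(
    "anthropic/claude-sonnet-4-20250514",
    [{"role": "user", "content": "Hello"}],
)
# urllib.request.urlopen(req) would send it; the network call is omitted here
```

The same request shape works for every model slug in the list above — only the `model` field changes.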

Default model

The default model is used when:

  • An agent doesn't have a model explicitly set

  • New agents are created without specifying a model

  • System-level tasks (routing, classification) need an LLM

Set the default in General Settings.

Model fallback

Configure a fallback chain for resilience:

  1. Primary — the agent's assigned model

  2. Fallback 1 — used if primary is unavailable or rate-limited

  3. Fallback 2 — last resort


Using OpenRouter as your primary provider gives you built-in fallback — OpenRouter routes to alternative models automatically if one is down.
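The fallback chain is a simple ordered retry: try the primary, and on failure move to the next entry. A sketch under stated assumptions — `ModelUnavailableError` and `call_model` are hypothetical names for whatever error type and provider-call function your stack uses:

```python
class ModelUnavailableError(Exception):
    """Raised when a model is down or rate-limited (hypothetical error type)."""

def complete_with_fallback(prompt: str, models: list[str], call_model) -> str:
    """Try each model in order; return the first successful completion.

    `call_model(model, prompt)` is whatever function actually hits the
    provider; it should raise ModelUnavailableError on rate limits or outages.
    """
    last_error = None
    for model in models:
        try:
            return call_model(model, prompt)
        except ModelUnavailableError as err:
            last_error = err  # fall through to the next model in the chain
    raise last_error or ModelUnavailableError("no models configured")

# primary, fallback 1, fallback 2
chain = [
    "anthropic/claude-sonnet-4-20250514",
    "google/gemini-2.5-pro",
    "deepseek/deepseek-chat",
]
```

Note that this catches only unavailability errors; authentication or validation errors should surface immediately rather than burn through the chain.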

Embedding model

The embedding model converts document chunks into vectors for semantic search:

  • Default: qwen/qwen3-embedding-8b via OpenRouter

  • Dimensions: 2048

  • Configurable in Settings
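Whatever client code stores or queries these vectors should enforce the configured dimensionality, since vectors embedded at different sizes cannot be compared. A small illustrative guard (the function name is a convention, not part of the product):

```python
EXPECTED_DIMENSIONS = 2048  # output size of qwen/qwen3-embedding-8b per settings above

def validate_embedding(vector: list[float]) -> list[float]:
    """Reject vectors whose dimensionality doesn't match the configured model.

    Stored chunk vectors and query vectors must share one dimensionality,
    or similarity comparisons between them are meaningless.
    """
    if len(vector) != EXPECTED_DIMENSIONS:
        raise ValueError(
            f"expected {EXPECTED_DIMENSIONS} dimensions, got {len(vector)}"
        )
    return vector
```

If you change the embedding model in Settings, any previously stored vectors with a different dimensionality must be re-embedded.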
