
LLM Providers

DAIV supports the following LLM providers: OpenRouter (the default), OpenAI, Anthropic, and Google Gemini.

You can mix providers — for example, use OpenRouter for the main agent and a direct provider for a specific model override.

How models are specified

Models use a prefix system to identify the provider:

Prefix       Provider               Example
openrouter:  OpenRouter             openrouter:anthropic/claude-sonnet-4.6
claude       Anthropic (direct)     claude-sonnet-4.6
gpt-, o4     OpenAI (direct)        gpt-5.3-codex
gemini       Google Gemini (direct) gemini-2.5-pro-preview-05-06

DAIV resolves the provider automatically from the model name. No extra configuration is needed beyond setting the API key for the provider you want to use.
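The prefix resolution described above can be sketched as follows. This is an illustrative sketch only; the function name and the internal provider labels are assumptions, not DAIV's actual implementation.

```python
# Hypothetical sketch of prefix-based provider resolution.
# Function and provider names are illustrative assumptions.
def resolve_provider(model: str) -> str:
    """Infer the LLM provider from a model name's prefix."""
    if model.startswith("openrouter:"):
        return "openrouter"
    if model.startswith("claude"):
        return "anthropic"
    if model.startswith(("gpt-", "o4")):
        return "openai"
    if model.startswith("gemini"):
        return "google"
    raise ValueError(f"Unrecognized model name: {model}")
```

For example, `resolve_provider("openrouter:anthropic/claude-sonnet-4.6")` routes to OpenRouter, while the bare name `claude-sonnet-4.6` routes directly to Anthropic.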

Default models

Role              Model              Provider
Main agent        Claude Sonnet 4.6  OpenRouter
Max mode          Claude Opus 4.6    OpenRouter
Explore subagent  Claude Haiku 4.5   OpenRouter
Fallback          GPT 5.3 Codex      OpenRouter

All defaults route through OpenRouter, so only OPENROUTER_API_KEY is required for a basic setup. Override models per repository in .daiv.yml — see Repository Config.
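A per-repository override might look like the fragment below. The key names here are an assumption for illustration only; consult Repository Config for the actual `.daiv.yml` schema.

```yaml
# Hypothetical .daiv.yml fragment; key names are illustrative, not the real schema.
models:
  main: openrouter:anthropic/claude-sonnet-4.6
  fallback: gpt-5.3-codex  # no prefix, so resolved as direct OpenAI
```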


OpenRouter

OpenRouter is the default provider for DAIV. It provides access to models from multiple vendors with built-in fallback support.

Setup:

  1. Obtain an API key from OpenRouter Settings
  2. Set the environment variable:
    Text Only
    OPENROUTER_API_KEY=your-api-key-here
    

Model format:

Use the openrouter: prefix followed by the model name from OpenRouter:

Text Only
openrouter:anthropic/claude-sonnet-4.6
openrouter:openai/gpt-5.3-codex

OpenAI

Setup:

  1. Obtain an API key from OpenAI
  2. Set the environment variable:
    Text Only
    OPENAI_API_KEY=your-api-key-here
    

Model format:

Use the model name from OpenAI directly (no prefix needed):

Text Only
gpt-5.3-codex
o4-mini

Anthropic

Setup:

  1. Obtain an API key from Anthropic
  2. Set the environment variable:
    Text Only
    ANTHROPIC_API_KEY=your-api-key-here
    

Model format:

Use the model name from Anthropic directly (no prefix needed):

Text Only
claude-sonnet-4.6
claude-opus-4.6

Warning

We love Anthropic, but their API can be unstable and often returns errors, and its rate limits can be exhausted very quickly.


Google Gemini

Setup:

  1. Obtain an API key from AI Studio
  2. Set the environment variable:
    Text Only
    GOOGLE_API_KEY=your-api-key-here
    

Model format:

Use the model name from Gemini directly (no prefix needed):

Text Only
gemini-2.5-pro-preview-05-06
gemini-2.5-flash-preview-04-17