ema / config / LLMConfig

Class: LLMConfig

Defined in: packages/ema/src/config.ts:173

LLM configuration.

Constructors

Constructor

```ts
new LLMConfig(
   openai,
   google,
   chat_provider,
   chat_model,
   retry): LLMConfig;
```

Defined in: packages/ema/src/config.ts:174

Parameters

openai

OpenAIApiConfig = ...

OpenAI API configuration.

google

GoogleApiConfig = ...

Google API configuration.

chat_provider

Provider name used for the chat agent. If the environment variable EMA_CHAT_PROVIDER is set, it takes precedence over this value.

Examples

```yaml
# Configure to use Google Generative AI in config.yaml
llm:
  chat_provider: "google"
  google:
    key: "sk-1234567890"
```

```yaml
# Configure to use model "gemini-2.5-flash" and Google Generative AI in config.yaml
llm:
  chat_provider: "google"
  chat_model: "gemini-2.5-flash"
  google:
    key: "sk-1234567890"
```

```env
# Configure to use DeepSeek in .env
EMA_CHAT_PROVIDER=openai
EMA_CHAT_MODEL=deepseek-chat
OPENAI_API_KEY=sk-1234567890
OPENAI_API_BASE=https://api.deepseek.com
```

"google" | "openai"

chat_model

string = "gemini-2.5-flash"

Model name used for the chat agent. If the environment variable EMA_CHAT_MODEL is set, it takes precedence over this value.

See

chat_provider for examples.

retry

RetryConfig = ...

Retry configuration for the LLM provider.

Returns

LLMConfig

Properties

chat_model

```ts
chat_model: string = "gemini-2.5-flash";
```

Defined in: packages/ema/src/config.ts:222

Model name used for the chat agent. If the environment variable EMA_CHAT_MODEL is set, it takes precedence over this value.

See

chat_provider for examples.


chat_provider

```ts
chat_provider: "google" | "openai" = "google";
```

Defined in: packages/ema/src/config.ts:215

Provider name used for the chat agent. If the environment variable EMA_CHAT_PROVIDER is set, it takes precedence over this value.

Examples

```yaml
# Configure to use Google Generative AI in config.yaml
llm:
  chat_provider: "google"
  google:
    key: "sk-1234567890"
```

```yaml
# Configure to use model "gemini-2.5-flash" and Google Generative AI in config.yaml
llm:
  chat_provider: "google"
  chat_model: "gemini-2.5-flash"
  google:
    key: "sk-1234567890"
```

```env
# Configure to use DeepSeek in .env
EMA_CHAT_PROVIDER=openai
EMA_CHAT_MODEL=deepseek-chat
OPENAI_API_KEY=sk-1234567890
OPENAI_API_BASE=https://api.deepseek.com
```
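
The precedence rule above can be sketched as follows. `resolveChatProvider` is a hypothetical helper written for illustration only, not part of the ema API; the real resolution happens inside `LLMConfig`:

```typescript
// Illustrative sketch of the documented precedence: when EMA_CHAT_PROVIDER
// is set in the environment, it overrides the chat_provider value read
// from config.yaml. Hypothetical helper, not part of the ema API.
declare const process: { env: Record<string, string | undefined> };

type ChatProvider = "google" | "openai";

function resolveChatProvider(configured: ChatProvider): ChatProvider {
  const fromEnv = process.env.EMA_CHAT_PROVIDER;
  if (fromEnv === "google" || fromEnv === "openai") {
    return fromEnv; // environment variable takes precedence
  }
  return configured; // fall back to the value from config.yaml
}
```

Under this sketch, `chat_model` would resolve the same way against EMA_CHAT_MODEL.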

google

```ts
readonly google: GoogleApiConfig;
```

Defined in: packages/ema/src/config.ts:182

Google API configuration.


openai

```ts
readonly openai: OpenAIApiConfig;
```

Defined in: packages/ema/src/config.ts:178

OpenAI API configuration.


retry

```ts
readonly retry: RetryConfig;
```

Defined in: packages/ema/src/config.ts:226

Retry configuration for the LLM provider.