# OpenAI
Configure OpenAI as an LLM provider in agentgateway.
## Configuration
Review the following example configuration.

```yaml
# yaml-language-server: $schema=https://agentgateway.dev/schema/config
llm:
  models:
  - name: "*"
    provider: openAI
    params:
      apiKey: "$OPENAI_API_KEY"
```

| Setting | Description |
|---|---|
| `name` | The model name to match in incoming requests. When a client sends `"model": "<name>"`, the request is routed to this provider. Use `*` to match any model name. |
| `provider` | The LLM provider. Set to `openAI` for OpenAI models. |
| `params.model` | The specific OpenAI model to use. If set, this model is used for all requests. If unset, each request must include the model to use. |
| `params.apiKey` | The OpenAI API key for authentication. You can reference environment variables with the `$VAR_NAME` syntax. |
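The wildcard behavior that the `name` setting describes can be sketched in a few lines. This is illustrative only: it models the match with glob semantics, which agentgateway's actual matcher may not share exactly.

```python
from fnmatch import fnmatch

def model_matches(pattern: str, requested: str) -> bool:
    """Illustrative: does a configured `name` pattern match the model in a request?"""
    return fnmatch(requested, pattern)

# name: "*" routes any requested model to this provider
print(model_matches("*", "gpt-4o-mini"))       # True
# an exact name matches only that model
print(model_matches("gpt-4o", "gpt-4o"))       # True
print(model_matches("gpt-4o", "gpt-4o-mini"))  # False
```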
You can also use the binds/listeners/routes configuration format. See the Routing-based configuration guide for more information.

## Connect to Codex
Use agentgateway as a proxy to your OpenAI provider from the Codex client.
Create an agentgateway configuration without specifying a model, so the Codex client’s model choice is used.
```shell
cat > config.yaml << 'EOF'
# yaml-language-server: $schema=https://agentgateway.dev/schema/config
llm:
  models:
  - name: openai
    provider: openAI
    params:
      apiKey: "$OPENAI_API_KEY"
EOF
```

Point Codex at agentgateway through one of the following methods.
Codex uses the `OPENAI_BASE_URL` environment variable to override the default OpenAI endpoint. Use a base URL that includes `/v1` so requests go to `/v1/responses` and OpenAI does not return a 404.

```shell
export OPENAI_BASE_URL="http://localhost:4000/v1"
codex
```

To override the base URL for a single run, set `model_provider` and the provider's `name` and `base_url` (the `-c` values are TOML).

```shell
codex -c 'model_provider="proxy"' \
  -c 'model_providers.proxy.name="OpenAI via agentgateway"' \
  -c 'model_providers.proxy.base_url="http://localhost:4000/v1"'
```

To configure the base URL permanently, add the following to your `~/.codex/config.toml`. The `name` field is required for custom providers. For more information, see Advanced Configuration.

```toml
[model_providers.proxy]
name = "OpenAI via agentgateway"
base_url = "http://localhost:4000/v1"
```