VS Code Continue

Configure Continue, the open-source AI code assistant for VS Code, to route requests through agentgateway.

Before you begin

  • Agentgateway running at http://localhost:3000 with a configured LLM backend.
  • VS Code with the Continue extension installed.

Example agentgateway configuration

# yaml-language-server: $schema=https://agentgateway.dev/schema/config
llm:
  port: 3000
  models:
  - name: "*"
    provider: openAI
    params:
      apiKey: "$OPENAI_API_KEY"
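Because agentgateway exposes an OpenAI-compatible API on the configured port, any OpenAI-style client can talk to it. The sketch below shows how a chat completion request would be addressed; the base URL and model name are assumptions matching the example configuration above, and `build_chat_request` is a hypothetical helper, not part of any library.

```python
import json

# Assumed gateway address from the example configuration above.
GATEWAY_BASE = "http://localhost:3000/v1"

def build_chat_request(model: str, prompt: str) -> tuple[str, str]:
    """Return the (url, json_body) pair an OpenAI-compatible client would send."""
    url = f"{GATEWAY_BASE}/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, body

url, body = build_chat_request("gpt-4o-mini", "Hello, are you working?")
print(url)  # http://localhost:3000/v1/chat/completions
```

Only the base URL is agentgateway-specific; the payload is the standard chat completions shape that Continue itself sends.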

Configure Continue

  1. Edit the ~/.continue/config.json file to add your agentgateway endpoint:

{
  "models": [
    {
      "title": "agentgateway",
      "provider": "openai",
      "model": "gpt-4o-mini",
      "apiBase": "http://localhost:3000/v1"
    }
  ]
}

  2. Save the file and reload Continue in VS Code.
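If you script your editor setup, the same entry can be generated and sanity-checked programmatically before writing it out. This is a minimal sketch using plain JSON handling, not any Continue API; the field values mirror the example above.

```python
import json

# The Continue model entry from the example above, built as plain JSON.
entry = {
    "title": "agentgateway",
    "provider": "openai",
    "model": "gpt-4o-mini",
    "apiBase": "http://localhost:3000/v1",
}
config = {"models": [entry]}

# Sanity checks before writing to ~/.continue/config.json:
# provider must be "openai" and apiBase must end with the /v1 path.
assert entry["provider"] == "openai"
assert entry["apiBase"].endswith("/v1")

print(json.dumps(config, indent=2))
```

Writing the file itself is left out deliberately; merging into an existing ~/.continue/config.json should preserve any models already defined there.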

Review the following table to understand this configuration.

Field      Description
title      Display name shown in the Continue model selector.
provider   Set to openai for any OpenAI-compatible endpoint.
model      The model name from your agentgateway backend configuration.
apiBase    Your agentgateway URL with the /v1 path.

Verify the connection

  1. Open the Continue sidebar in VS Code (Cmd + M on macOS, Ctrl + M on Windows/Linux).
  2. Select agentgateway from the model dropdown.
  3. Send a test message: “Hello, are you working?”
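Under the hood, Continue sends the test message as a chat completion request and reads the reply from choices[0].message.content. The sketch below extracts the reply from an illustrative, hand-written response body (not a real capture from agentgateway):

```python
import json

# Illustrative OpenAI-style response body; the content string is made up
# for this example, not recorded from a live gateway.
sample_response = json.dumps({
    "choices": [
        {"message": {"role": "assistant", "content": "Yes, I'm working."}}
    ]
})

def extract_reply(raw: str) -> str:
    """Pull the assistant's text out of an OpenAI-compatible response."""
    data = json.loads(raw)
    return data["choices"][0]["message"]["content"]

print(extract_reply(sample_response))  # Yes, I'm working.
```

If the verification step fails, this is the response shape to inspect: an error from the LLM backend usually arrives as a JSON body without a choices array.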
