Windsurf


Configure Windsurf, the AI code editor by Codeium, to route requests to your LLM through your agentgateway proxy.

Before you begin

  • Agentgateway running at http://localhost:3000 with a configured LLM backend.
  • Windsurf installed.

Example agentgateway configuration

cat > /tmp/test-windsurf.yaml << 'EOF'
# yaml-language-server: $schema=https://agentgateway.dev/schema/config
llm:
  port: 3000
  models:
  - name: "*"
    provider: openAI
    params:
      apiKey: "$OPENAI_API_KEY"
EOF
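With the file in place, start the proxy against it. A minimal sketch, assuming `OPENAI_API_KEY` is set in your environment and that the agentgateway binary accepts a `-f` flag for the config file path (check `agentgateway --help` for your version):

```shell
# Assumption: the agentgateway binary takes -f for the config file path.
export OPENAI_API_KEY=sk-...   # replace with your real OpenAI API key
agentgateway -f /tmp/test-windsurf.yaml
```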

Configure Windsurf

Configure Windsurf to route LLM requests through agentgateway. For more information, review the Windsurf documentation.

  1. Open Windsurf Settings.

    • macOS: Cmd + , or Windsurf > Settings
    • Windows/Linux: Ctrl + , or File > Preferences > Settings
  2. Search for Http: Proxy.

  3. Enter your agentgateway URL.

    http://localhost:3000
  4. Save the settings.
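Because Windsurf is built on VS Code, the same proxy setting can also be added directly to its settings.json file. This uses the standard VS Code `http.proxy` key; a sketch, not verified against every Windsurf version:

```json
{
  "http.proxy": "http://localhost:3000"
}
```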

Verify the connection

  1. Open the Windsurf chat panel.
  2. Send a message such as “test”.
  3. Confirm that Windsurf responds. The reply is now routed through your agentgateway backend.
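To separate proxy issues from editor issues, you can also exercise the gateway directly from a terminal. A hedged sketch, assuming agentgateway exposes an OpenAI-compatible /v1/chat/completions endpoint on the configured port (the model name here is illustrative; the wildcard `"*"` in the example config accepts any model):

```shell
# Assumption: the gateway serves an OpenAI-compatible chat completions API.
curl -s http://localhost:3000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4o-mini", "messages": [{"role": "user", "content": "test"}]}'
```

A JSON completion response here confirms the gateway and LLM backend are working before you troubleshoot Windsurf itself.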
