Cursor

Configure Cursor, the AI code editor, to route its LLM requests through your agentgateway proxy.

Before you begin

  • Agentgateway running at http://localhost:3000 with a configured LLM backend.
  • Cursor installed (version 0.30 or later).

Example agentgateway configuration

# yaml-language-server: $schema=https://agentgateway.dev/schema/config
llm:
  port: 3000
  models:
  - name: "*"
    provider: openAI
    params:
      apiKey: "$OPENAI_API_KEY"
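
Before wiring up Cursor, it can help to confirm that something is listening on the gateway port. A minimal stdlib sketch (host and port taken from the config above; the helper name is just for illustration):

```python
import socket

# Quick reachability check for the gateway port configured above (3000).
def port_open(host="localhost", port=3000, timeout=2.0):
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

reachable = port_open()
print("agentgateway listening on :3000" if reachable else "nothing listening on :3000")
```

This only verifies that the port accepts TCP connections; it does not confirm the LLM backend is healthy.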

Configure Cursor

  1. Open the Cursor Settings.

    • macOS: Cmd + , or Cursor > Settings
    • Windows/Linux: Ctrl + , or File > Preferences > Settings
  2. Navigate to the Models tab.

  3. Enable Override OpenAI Base URL and enter your agentgateway address.

    http://localhost:3000
    Note: You do not need to provide LLM provider credentials (such as an API key) through Cursor; the credentials are configured in agentgateway. Toggle off any API key overrides in Cursor.

Verify the connection

  1. Open the Cursor chat panel (Cmd + L on macOS, Ctrl + L on Windows/Linux).
  2. Send a message such as “test”.
  3. A successful reply confirms that Cursor is routing requests through your agentgateway backend.
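
The same check can be scripted outside of Cursor. A minimal stdlib sketch, assuming agentgateway exposes the OpenAI-compatible /v1/chat/completions path at the address above (the model name below is arbitrary, since the example config routes "*"):

```python
import json
import urllib.error
import urllib.request

GATEWAY = "http://localhost:3000"  # the agentgateway address configured above

# Illustrative helper: build the OpenAI-style chat request that Cursor
# issues against the overridden base URL.
def chat_request(prompt, model="gpt-4o-mini"):
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{GATEWAY}/v1/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )

req = chat_request("test")
print(req.full_url)  # http://localhost:3000/v1/chat/completions
try:
    with urllib.request.urlopen(req, timeout=5) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
except urllib.error.URLError as err:
    print(f"gateway unreachable: {err}")
```

If this prints a model reply but Cursor still fails, the problem is in the Cursor settings rather than the gateway.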