Product type: ai_gateway · bridge

MCP Server

Expose any AI Gateway endpoint as an MCP tool. Agents (Claude Desktop, custom MCP clients) call your endpoints via JSON-RPC at POST /api/{uuid}/mcp — with the full guardrail, audit, and routing pipeline applied to every invocation.
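The wire format is plain JSON-RPC 2.0 over HTTP POST. A minimal sketch of the request body a client sends to the bridge — the gateway URL, the summarize-ticket tool name, and the arguments are hypothetical placeholders; only the envelope shape is fixed by the protocol:

```python
import json

# Substitute your project's UUID; the host is a placeholder.
MCP_URL = "https://promptgate.your.co/api/<uuid>/mcp"

def jsonrpc_request(method, params=None, req_id=1):
    """Build a JSON-RPC 2.0 request object."""
    body = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        body["params"] = params
    return body

# tools/call invokes one exposed endpoint: "name" is the endpoint slug,
# "arguments" must match the endpoint's declared input schema.
call = jsonrpc_request("tools/call", {
    "name": "summarize-ticket",           # hypothetical endpoint slug
    "arguments": {"ticket_id": "T-123"},  # hypothetical schema fields
})
print(json.dumps(call))
```

POSTing this body to the /mcp route (with a scoped bearer token) is all an MCP-aware agent does under the hood.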

Feature, not a project type

The MCP Server is a feature on AI Gateway projects, not a separate project type. Toggle expose_as_mcp_tool=true on any AI Gateway endpoint and the project's /mcp route serves it as a JSON-RPC tool.

WHAT'S IN THE BOX

JSON-RPC over HTTP, full pipeline

Every MCP tool call goes through PromptGate's standard guardrail + audit + routing pipeline. The bridge is just the wire format.

Expose endpoints as tools

One toggle on the endpoint. Tool name = endpoint slug. Tool description + schema come from the endpoint's metadata.
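Once toggled, the endpoint shows up in the bridge's tools/list result. A sketch of that shape, assuming one exposed endpoint — the field names (name / description / inputSchema) follow the MCP tool definition, while the values are hypothetical:

```python
# Illustrative tools/list result for a single exposed endpoint.
tools_list_result = {
    "tools": [
        {
            "name": "summarize-ticket",  # the endpoint slug
            "description": "Summarize a support ticket",  # from endpoint metadata
            "inputSchema": {
                "type": "object",
                "properties": {"ticket_id": {"type": "string"}},
                "required": ["ticket_id"],
            },
        }
    ]
}
```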

Standard JSON-RPC methods

initialize, tools/list, tools/call, ping, batch requests, notifications. Full MCP wire-protocol compliance.
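For illustration, a typical session opener can be sketched as a single batch POST; the protocolVersion string and clientInfo values are illustrative, and the JSON-RPC rules about batches and notifications are from the spec, not specific to this bridge:

```python
import json

# A JSON-RPC batch is one POST whose body is an array of request objects.
batch = [
    {"jsonrpc": "2.0", "id": 1, "method": "initialize",
     "params": {"protocolVersion": "2024-11-05", "capabilities": {},
                "clientInfo": {"name": "demo-client", "version": "0.1"}}},
    {"jsonrpc": "2.0", "id": 2, "method": "tools/list"},
    # Notifications omit "id" entirely and receive no response entry.
    {"jsonrpc": "2.0", "method": "notifications/initialized"},
]
print(json.dumps(batch))
```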

Full guardrail pipeline

tools/call delegates to GatewayService: PII filter, secret scanner, prompt-injection guard, schema validation, rate limits, budgets — all enforced.

Scoped auth

Dedicated mcp token scope. Agents that need MCP get a token with only that scope and can't call /v1/chat/completions directly.

Audited as MCP-native

Every tool call lands in gateway_logs with via=mcp_bridge. The audit story is identical to that for direct API calls.
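Because MCP traffic shares one log table with direct calls, separating it is just a filter on that column. A sketch with made-up rows — only the via value is from this doc; the endpoint names and cost_usd column are illustrative stand-ins for the real gateway_logs schema:

```python
# Hypothetical rows standing in for gateway_logs.
logs = [
    {"endpoint": "summarize-ticket", "via": "mcp_bridge", "cost_usd": 0.0042},
    {"endpoint": "summarize-ticket", "via": "api",        "cost_usd": 0.0039},
    {"endpoint": "triage",           "via": "mcp_bridge", "cost_usd": 0.0011},
]

# All MCP-originated calls, and their total cost, in one pass.
mcp_calls = [row for row in logs if row["via"] == "mcp_bridge"]
mcp_cost = sum(row["cost_usd"] for row in mcp_calls)
```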

Endpoint detail surfaces it

Each MCP-exposed endpoint shows tool name + bridge URL + curl example on its detail page. Required scope is documented inline.

EXAMPLE

Connect Claude Desktop to your endpoints

Drop the MCP server config into Claude Desktop's claude_desktop_config.json. Every endpoint with expose_as_mcp_tool=true shows up in Claude's tool drawer.

  • Tool list refreshes when you flip the toggle in PromptGate's UI
  • Per-tool guardrails apply on every invocation
  • Cost shows up in the same dashboard as your direct API calls
claude_desktop_config.json:
{
  "mcpServers": {
    "promptgate": {
      "url": "https://promptgate.your.co/api/<uuid>/mcp",
      "headers": {
        "Authorization": "Bearer pg_live_..."
      }
    }
  }
}
DECISION HELPER

Use the MCP Server feature when

  • You've already built AI Gateway endpoints and want to expose them to agents
  • You want one MCP-server URL with N tools instead of N standalone server processes
  • You want guardrails / audit / cost on every MCP tool call
  • Your agents are MCP-aware (Claude Desktop, custom JSON-RPC clients)

Pick something else when

  • You want to aggregate existing MCP servers behind one endpoint → MCP Gateway
  • You're shipping an LLM-backed feature without MCP exposure → AI Gateway alone is enough
  • You want OpenAI / Anthropic SDK clients → Agent Proxy

Already running an AI Gateway?

Toggle expose_as_mcp_tool on any endpoint. Your AI Gateway project is now also an MCP server. No new project, no new infra.

Install Community Edition · Expose-as-tool recipe ↗