MCP servers
Point any MCP client (Claude Desktop, Cursor, Zed, VS Code with Continue) at AdaptiveAPI, and every MCP server it talks to now speaks your language. Tool descriptions, tool arguments, tool results, and prompts all flow through the translation pipeline without changes to your client.
Two flavours, same pipeline
MCP servers come in two shapes. AdaptiveAPI handles both, but the wiring differs.
- Remote MCP servers speak HTTP+JSON-RPC. Linear, Atlassian, Sentry, Stripe, Zoom, and most hosted vendors. AdaptiveAPI proxies the URL directly.
- Stdio MCP servers are locally spawned processes. GitHub, GitLab, Slack, Postgres, filesystem, and most community servers. AdaptiveAPI sees only the JSON-RPC bodies, through a thin local bridge.
Remote MCP servers
Change the url in your client's MCP config to the AdaptiveAPI route. The original upstream URL is configured on the route itself, in the admin UI.
```json
{
  "mcpServers": {
    "linear-de": {
      "url": "https://adaptiveapi.example.com/mcp/rt_yourtenant_xxxxx",
      "headers": {
        "Authorization": "Bearer <your-linear-oauth-token>"
      }
    }
  }
}
```
Your OAuth token rides upstream byte-identical. AdaptiveAPI never stores or logs it. Only a SHA-256 fingerprint of the Authorization header lands in the audit log, never the value itself.
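How that fingerprint is derived is not spelled out beyond "SHA-256 of the Authorization header"; a minimal sketch of the idea (the `sha256:` prefix and 16-hex-char truncation are assumptions for illustration):

```python
import hashlib

def auth_fingerprint(authorization_header: str) -> str:
    """Derive an audit-log fingerprint from the raw Authorization header.

    Only this digest is recorded; the header value itself is forwarded
    upstream unchanged and never stored or logged.
    """
    digest = hashlib.sha256(authorization_header.encode("utf-8")).hexdigest()
    # A truncated digest is enough to correlate audit entries for the
    # same credential without being reversible to the token.
    return f"sha256:{digest[:16]}"
```

The same token always yields the same fingerprint, so audit entries for one credential can be grouped without ever exposing its value.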
The catalog
The admin UI's /mcp/catalog page ships a curated list of 15 popular servers, pre-configured (Linear, Atlassian, Sentry, Stripe, Zoom, and more). One click gives you a copy-paste snippet for Claude Desktop, Cursor, Zed, VS Code with Continue, or raw JSON. The catalog file lives at catalog/mcp-servers.json in the repo and is configurable via Mcp__CatalogFile.
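The catalog schema itself is not documented in this section. Purely as an illustration, one entry in catalog/mcp-servers.json might look something like the following; every field name here is an assumption, and the upstream URL is a placeholder:

```json
{
  "id": "linear",
  "name": "Linear",
  "kind": "remote",
  "upstreamUrl": "https://<vendor-mcp-endpoint>",
  "auth": "oauth"
}
```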
Stdio MCP servers
Stdio servers are processes the MCP client spawns locally. They typically handle credentials that should never leave your machine (a GitHub PAT, a Postgres connection string, a filesystem mount). AdaptiveAPI does not want to see those credentials, and the translation pipeline does not need them.
The local bridge wraps the existing stdio command, intercepts the JSON-RPC bodies, sends them to AdaptiveAPI's stateless translate API, and forwards the translated bodies to the real upstream process.
Claude Desktop config (mcp.json):

```json
{
  "mcpServers": {
    "github-de": {
      "command": "npx",
      "args": [
        "-y", "@adaptiveapi/mcp-bridge",
        "--target-lang", "de",
        "--upstream", "npx", "-y", "@modelcontextprotocol/server-github"
      ],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "ghp_...",
        "ADAPTIVEAPI_TRANSLATE_URL": "https://adaptiveapi.example.com/v1/translate/rt_yourtenant_xxxxx"
      }
    }
  }
}
```
Your GitHub token never leaves your machine. The MCP server process receives it via env vars, exactly as it does today. Only the JSON-RPC bodies (translated request, translated response) flow through AdaptiveAPI.
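The bridge's core is a relay loop: read a JSON-RPC body from one side, translate it, write it to the other. A minimal sketch, assuming newline-delimited JSON-RPC framing and a POST-per-message translate endpoint — both are assumptions here, not the documented wire format of @adaptiveapi/mcp-bridge:

```python
import io
import urllib.request

def translate_body(body: bytes, translate_url: str) -> bytes:
    """POST one JSON-RPC body to the stateless translate endpoint and
    return the translated body (the endpoint shape is an assumption)."""
    req = urllib.request.Request(
        translate_url,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()

def relay(src, dst, translate):
    """Forward newline-delimited JSON-RPC messages from src to dst,
    passing each body through the translate callable.

    Credentials live in the upstream process's environment and never
    pass through here -- only the message bodies do.
    """
    for line in src:
        line = line.strip()
        if not line:
            continue  # skip blank keep-alive lines
        dst.write(translate(line) + b"\n")
        dst.flush()
```

The real bridge would run two such relays concurrently: client stdin to the upstream process, and the upstream's stdout back to the client.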
Distribution
- npm. @adaptiveapi/mcp-bridge. Works anywhere Node is installed.
- Static Go binary. For environments where you do not want a Node runtime. Single executable, no dependencies.
What gets translated
The translation pipeline walks the JSON-RPC payload and translates only the human-text fields:
- Tool descriptions. The tools/list response. The model sees them in source language, your user reads them in target language.
- Tool argument descriptions. JSON Schema description fields.
- Tool call arguments. Walked with the same key-aware denylist as OpenAI's arguments. Identifiers, slugs, and URLs survive verbatim.
- Tool results. Text content from content[*].text entries.
- Prompts. prompts/get messages.
The JSON-RPC envelope (id, method, jsonrpc) is never touched, and result schemas are preserved.
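The key-aware walk above can be sketched in a few lines. The TEXT_KEYS set is a stand-in for the per-route configuration, and str.upper stands in for the actual translator:

```python
# Keys whose string values are treated as human text. The real
# denylist/allowlist is per-route configuration; this set is a sketch.
TEXT_KEYS = {"description", "text", "title"}

def translate_walk(node, translate, key=None):
    """Recursively translate only human-text leaves, leaving the
    JSON-RPC envelope (jsonrpc, id, method), identifiers, slugs,
    and URLs verbatim."""
    if isinstance(node, dict):
        return {k: translate_walk(v, translate, k) for k, v in node.items()}
    if isinstance(node, list):
        return [translate_walk(v, translate, key) for v in node]
    if isinstance(node, str) and key in TEXT_KEYS:
        return translate(node)
    return node  # non-text leaf: pass through untouched
```

Running it over a tools/list result translates each tool's description while the tool name, id, and envelope fields come out byte-for-byte identical.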
Note. Some MCP servers stream large amounts of data (file listings, search results). AdaptiveAPI only translates the human-text leaves you point at via the per-route translateJsonPaths config. Identifiers, paths, and metadata do not get translated, so downstream code that expects exact strings keeps working.
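The translateJsonPaths syntax is not shown above. As a sketch, assuming JSONPath-style selectors, a route that limits translation to tool descriptions and result text might carry something like:

```json
{
  "translateJsonPaths": [
    "$.result.tools[*].description",
    "$.result.content[*].text"
  ]
}
```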