Authentication

AdaptiveAPI handles three sets of credentials. Some you set once in configuration. Some travel with each request. Some are issued by the admin UI. This page explains where each one goes, who sees it, and how it is stored.

The three layers

  1. Translator-backend keys. DeepL or LLM-translator API key. Set once in config. Stored encrypted at rest.
  2. Upstream provider keys. OpenAI, Anthropic, MCP server tokens. Travel with each request. Never stored. Forwarded byte-identical.
  3. Route tokens. Issued by the admin UI. Identify which route the caller is using. Argon2id-hashed at rest.

The admin UI itself is a fourth layer (OIDC for production, optional open dev mode for local).

Translator-backend keys

AdaptiveAPI calls a translator backend on your behalf. You configure that key once. It lives in deploy/.env for Docker, in your appsettings.json or environment variables for source builds, and in a Kubernetes Secret for Helm.

DeepL

DeepL is the recommended translator backend. You need an account with the DeepL API. Free and paid tiers use different base URLs.

LLMTRANS_DEFAULT_TRANSLATOR=deepl
DEEPL_API_KEY="your-deepl-api-key"
DEEPL_BASE_URL=https://api.deepl.com/         # paid tier
# DEEPL_BASE_URL=https://api-free.deepl.com/  # free tier

Get a key at www.deepl.com/pro-api. The free tier is rate limited and capped at 500,000 characters per month. The paid tier has no monthly cap and pricing is per character.
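
You can tell the tiers apart from the key itself: DeepL free-tier keys end in the suffix ":fx", paid keys do not. A small sketch of picking the right base URL from the key (the helper name is ours, not part of AdaptiveAPI):

```python
def deepl_base_url(api_key: str) -> str:
    """Pick the DeepL endpoint that matches a key.

    DeepL free-tier keys carry the ":fx" suffix; paid keys do not.
    """
    if api_key.endswith(":fx"):
        return "https://api-free.deepl.com/"
    return "https://api.deepl.com/"
```

If translation calls return authorization errors with a key you know is valid, a tier/base-URL mismatch is the first thing to check.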

LLM translator

As an alternative to DeepL, you can use any OpenAI-compatible endpoint as the translator. Useful when you already have an LLM provider relationship and want a single bill, or when you want a self-hosted translator like vLLM.

LLMTRANS_DEFAULT_TRANSLATOR=llm
LLM_TRANSLATOR_API_KEY="sk-..."
LLM_TRANSLATOR_BASE_URL=https://api.openai.com/
LLM_TRANSLATOR_MODEL=gpt-4o-mini

# or self-hosted:
# LLM_TRANSLATOR_BASE_URL=http://vllm.internal:8000/
# LLM_TRANSLATOR_MODEL=meta-llama/Meta-Llama-3.1-70B-Instruct
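
Configured this way, the translator is driven through the standard chat-completions shape. A minimal sketch of what such a request body looks like — the helper name and prompt wording are illustrative, not AdaptiveAPI's actual internals:

```python
def build_translation_request(text: str, target_lang: str, model: str) -> dict:
    """Build an OpenAI-compatible chat-completions body asking the
    model to translate `text` into `target_lang`."""
    return {
        "model": model,
        "messages": [
            {
                "role": "system",
                "content": f"Translate the user's text into {target_lang}. "
                           "Return only the translation, nothing else.",
            },
            {"role": "user", "content": text},
        ],
        "temperature": 0,  # deterministic output suits translation
    }

# A body like this would be POSTed to
# {LLM_TRANSLATOR_BASE_URL}v1/chat/completions with the header
# "Authorization: Bearer {LLM_TRANSLATOR_API_KEY}".
```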

Passthrough (no translator)

Passthrough is the default. Useful while wiring up routes for the first time. The pipeline runs but leaves text unchanged. No key needed.

LLMTRANS_DEFAULT_TRANSLATOR=passthrough

How translator keys are stored

In configuration, keys are read from environment variables or an appsettings.json file. Anything you put into the database via the admin UI (per-tenant overrides) is encrypted at rest using the framework data-protection key.

Upstream provider keys

The keys for OpenAI, Anthropic, and similar providers are your application's keys. AdaptiveAPI does not have its own. It forwards whatever the caller sends in the Authorization header, byte-identical, to the upstream.

OpenAI

Pass your real OpenAI API key as api_key in the SDK. The SDK sends it as Authorization: Bearer sk-.... AdaptiveAPI forwards that header upstream untouched.

from openai import OpenAI

client = OpenAI(
    api_key="sk-...",                    # your real OpenAI key
    base_url="https://adaptiveapi.example.com/v1/rt_yourtenant_xxxxx",
)

Anthropic

Same pattern. The Anthropic SDK uses the x-api-key header rather than Authorization. Both are forwarded.

import anthropic

client = anthropic.Anthropic(
    api_key="sk-ant-...",
    base_url="https://adaptiveapi.example.com/anthropic/v1/rt_yourtenant_xxxxx",
)

MCP server tokens

Remote MCP servers usually authenticate with OAuth Bearer tokens. Put your token in the headers block of the MCP client config. It rides upstream untouched.

{
  "mcpServers": {
    "linear-de": {
      "url": "https://adaptiveapi.example.com/mcp/rt_yourtenant_xxxxx",
      "headers": {
        "Authorization": "Bearer <your-linear-oauth-token>"
      }
    }
  }
}

Stdio MCP servers receive their credentials via env vars on the local machine, exactly as they do today. AdaptiveAPI never sees them.

Other providers (Cohere, Mistral, custom)

Same idea via the generic adapter. The inbound Authorization header is forwarded to the upstream URL configured on the route. See Generic JSON API.

How upstream keys are handled

Upstream keys pass through AdaptiveAPI in memory only. They are never written to the database or to logs; the audit trail records a SHA-256 fingerprint of each key, never the key itself.
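
Per the quick reference below, audit records keep only a short SHA-256 fingerprint of an upstream key, never the key. A minimal sketch of that kind of fingerprinting — the helper name and 12-character length are illustrative assumptions:

```python
import hashlib

def key_fingerprint(secret: str) -> str:
    """Return a short, non-reversible identifier for audit logs.

    The full key is never written anywhere; only this digest prefix is.
    The same key always yields the same fingerprint, so audit entries
    can be correlated without exposing the secret.
    """
    return hashlib.sha256(secret.encode("utf-8")).hexdigest()[:12]
```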

Route tokens

Route tokens are issued by the admin UI when you create a route. They look like rt_yourtenant_xxxxx. They identify which route the call is using. They are not provider credentials.

Tokens are hashed with Argon2id before storage. The plain value is shown to you exactly once at creation time. After that, the admin UI only displays the prefix and a fingerprint. Lose it, rotate it.
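
The "shown once, then masked" behavior can be sketched like this — the exact masking format is an illustrative assumption, not the admin UI's actual output:

```python
import hashlib

def masked_token(token: str) -> str:
    """Display form after creation: prefix plus a short fingerprint.

    The full token is never shown again; at rest it exists only as an
    Argon2id hash, so not even the server can recover it.
    """
    prefix = "_".join(token.split("_")[:2])            # e.g. "rt_yourtenant"
    digest = hashlib.sha256(token.encode()).hexdigest()[:8]
    return f"{prefix}_... ({digest})"
```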

For local development you can seed a stable token at startup with Dev__FixedRouteToken=rt_dev_LOCALDEMO. Local only. Do not set this in production.

Admin UI authentication

The admin UI mounts at /admin. Two modes: OIDC against your identity provider for production, and an optional open dev mode (no login) for local development.

Optional: Presidio sidecar

The PII redactor's higher-recall mode runs Microsoft Presidio Analyzer as an HTTP sidecar. It typically does not require auth on the internal network. If you put it behind one, AdaptiveAPI forwards a configured bearer token via PiiRedactor__Presidio__AuthHeader.

Quick reference

| Credential | Where it goes | How it is stored |
| --- | --- | --- |
| DeepL API key | DEEPL_API_KEY env var | Encrypted at rest in DB if set via UI. |
| LLM-translator key | LLM_TRANSLATOR_API_KEY env var | Encrypted at rest in DB if set via UI. |
| OpenAI / Anthropic / Cohere key | Inbound Authorization header | Never stored. SHA-256 fingerprint in audit only. |
| MCP OAuth token | Inbound Authorization header | Never stored. SHA-256 fingerprint in audit only. |
| Route token | URL path segment | Argon2id hash in DB. |
| Admin UI session | OIDC provider | Issued by your IdP. AdaptiveAPI verifies signature. |

Threat-model summary. A compromised AdaptiveAPI database leaks: route token hashes (useless without preimage), translator-backend keys (rotate them), audit metadata (no bodies, no plaintext upstream tokens). It does not leak any caller's OpenAI key, Anthropic key, MCP token, or message content.