Authentication
AdaptiveAPI handles three sets of credentials. Some you set once in configuration. Some travel with each request. Some are issued by the admin UI. This page explains where each one goes, who sees it, and how it is stored.
The three layers
- Translator-backend keys. DeepL or LLM-translator API key. Set once in config. Stored encrypted at rest.
- Upstream provider keys. OpenAI, Anthropic, MCP server tokens. Travel with each request. Never stored. Forwarded byte for byte.
- Route tokens. Issued by the admin UI. Identify which route the caller is using. Argon2id-hashed at rest.
The admin UI itself is a fourth layer (OIDC for production, optional open dev mode for local).
Translator-backend keys
AdaptiveAPI calls a translator backend on your behalf. You configure that
key once. It lives in deploy/.env for Docker, in your
appsettings.json or environment variables for source builds,
and in a Kubernetes Secret for Helm.
DeepL
DeepL is the recommended translator backend. You need an account with the DeepL API. Free and paid tiers use different base URLs.
LLMTRANS_DEFAULT_TRANSLATOR=deepl
DEEPL_API_KEY="your-deepl-api-key"
DEEPL_BASE_URL=https://api.deepl.com/ # paid tier
# DEEPL_BASE_URL=https://api-free.deepl.com/ # free tier
Get a key at www.deepl.com/pro-api. The free tier is rate-limited and
capped at 500,000 characters per month. The paid tier has no monthly
cap; pricing is per character.
LLM translator
As an alternative to DeepL, you can use any OpenAI-compatible endpoint as the translator. Useful when you already have an LLM provider relationship and want a single bill, or when you want a self-hosted translator like vLLM.
LLMTRANS_DEFAULT_TRANSLATOR=llm
LLM_TRANSLATOR_API_KEY="sk-..."
LLM_TRANSLATOR_BASE_URL=https://api.openai.com/
LLM_TRANSLATOR_MODEL=gpt-4o-mini
# or self-hosted:
# LLM_TRANSLATOR_BASE_URL=http://vllm.internal:8000/
# LLM_TRANSLATOR_MODEL=meta-llama/Meta-Llama-3.1-70B-Instruct
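A quick way to verify the key and base URL before pointing AdaptiveAPI at them is to list the models the endpoint serves. A standard-library sketch; the helper name models_request is illustrative:

```python
import urllib.request

def models_request(base_url: str, api_key: str) -> urllib.request.Request:
    # OpenAI-compatible servers, including vLLM, expose GET /v1/models
    # behind a standard Bearer token -- a cheap smoke test for the key.
    url = base_url.rstrip("/") + "/v1/models"
    return urllib.request.Request(
        url, headers={"Authorization": f"Bearer {api_key}"}
    )

req = models_request("http://vllm.internal:8000/", "sk-...")
# urllib.request.urlopen(req, timeout=10) returns {"data": [...]}
# with one entry per served model.
```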
Passthrough (no translator)
The default. Useful while wiring up routes for the first time. The pipeline runs but leaves text unchanged. No key needed.
LLMTRANS_DEFAULT_TRANSLATOR=passthrough
How translator keys are stored
In configuration, keys are read from environment variables or an
appsettings.json file. Anything you put into the database via
the admin UI (per-tenant overrides) is encrypted at rest using the
framework data-protection key.
Upstream provider keys
The keys for OpenAI, Anthropic, and similar are your application's
keys. AdaptiveAPI does not have its own. It forwards whatever the caller
sends in the Authorization header, byte for byte, to the
upstream.
OpenAI
Pass your real OpenAI API key as api_key in the SDK. The SDK
sends it as Authorization: Bearer sk-.... AdaptiveAPI forwards
that header upstream untouched.
from openai import OpenAI

client = OpenAI(
    api_key="sk-...",  # your real OpenAI key
    base_url="https://adaptiveapi.example.com/v1/rt_yourtenant_xxxxx",
)
Anthropic
Same pattern. The Anthropic SDK uses the x-api-key header
rather than Authorization. Both are forwarded.
import anthropic

client = anthropic.Anthropic(
    api_key="sk-ant-...",
    base_url="https://adaptiveapi.example.com/anthropic/v1/rt_yourtenant_xxxxx",
)
MCP server tokens
Remote MCP servers usually authenticate with OAuth Bearer tokens. Put your
token in the headers block of the MCP client config. It rides
upstream untouched.
{
  "mcpServers": {
    "linear-de": {
      "url": "https://adaptiveapi.example.com/mcp/rt_yourtenant_xxxxx",
      "headers": {
        "Authorization": "Bearer <your-linear-oauth-token>"
      }
    }
  }
}
Stdio MCP servers receive their credentials via env vars on the local machine, exactly as they do today. AdaptiveAPI never sees them.
Other providers (Cohere, Mistral, custom)
Same idea via the generic adapter. The inbound Authorization
header is forwarded to the upstream URL configured on the route. See
Generic JSON API.
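Because the generic adapter has no provider-specific wire format, any HTTP client works. A standard-library sketch; the route URL, upstream path, and payload shape below are illustrative, not a real route:

```python
import json
import urllib.request

def build_generic_request(route_url: str, provider_key: str, payload: dict):
    # The inbound Authorization header is relayed as-is to whatever
    # upstream URL the route is configured with.
    return urllib.request.Request(
        route_url,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {provider_key}",  # forwarded untouched
            "Content-Type": "application/json",
        },
    )

req = build_generic_request(
    "https://adaptiveapi.example.com/generic/rt_yourtenant_xxxxx/v1/chat",
    "<your-provider-key>",
    {"message": "Hallo!"},
)
# urllib.request.urlopen(req, timeout=30) would send it.
```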
How upstream keys are handled
- Never stored. Upstream keys are not persisted in the database.
- Never logged in plain text. The audit log stores a SHA-256 fingerprint of the Authorization value, never the value itself. Useful for abuse correlation, useless for impersonation.
- Never sent to the translator. Translation calls go to DeepL or your LLM translator with their own keys. Upstream keys are not part of that flow.
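The fingerprint is cheap to reproduce on your side, which is what makes it usable for correlation: hash the same header value and search the audit log for it. A sketch assuming the fingerprint is a plain SHA-256 hex digest of the header value (the exact encoding AdaptiveAPI uses is an assumption here):

```python
import hashlib

def auth_fingerprint(authorization_value: str) -> str:
    # One-way: the digest identifies repeated use of the same key,
    # but cannot be reversed to recover the key itself.
    return hashlib.sha256(authorization_value.encode("utf-8")).hexdigest()

fp = auth_fingerprint("Bearer sk-...")
print(fp)  # 64 hex chars; the same key always yields the same fingerprint
```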
Route tokens
Route tokens are issued by the admin UI when you create a route. They look
like rt_yourtenant_xxxxx. They identify which route the call
is using. They are not provider credentials.
Tokens are hashed with Argon2id before storage. The plain value is shown to you exactly once at creation time. After that, the admin UI only displays the prefix and a fingerprint. Lose it, rotate it.
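The store-a-hash, verify-on-use flow looks like this. Argon2id is not in the Python standard library, so the sketch below substitutes scrypt purely to illustrate the flow; it is not AdaptiveAPI's actual scheme or parameters:

```python
import hashlib
import hmac
import os

def hash_token(token: str, salt: bytes) -> bytes:
    # scrypt stand-in for Argon2id: a memory-hard KDF, so a leaked
    # hash is expensive to brute-force.
    return hashlib.scrypt(
        token.encode(), salt=salt,
        n=2**14, r=8, p=1, maxmem=64 * 1024 * 1024,
    )

def verify_token(presented: str, salt: bytes, stored_hash: bytes) -> bool:
    # Recompute and compare in constant time; the plain token is
    # never persisted anywhere.
    return hmac.compare_digest(hash_token(presented, salt), stored_hash)

salt = os.urandom(16)
stored = hash_token("rt_yourtenant_xxxxx", salt)  # shown once, then discarded
print(verify_token("rt_yourtenant_xxxxx", salt, stored))  # True
print(verify_token("rt_yourtenant_yyyyy", salt, stored))  # False
```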
For local development you can seed a stable token at startup with
Dev__FixedRouteToken=rt_dev_LOCALDEMO. Local only. Do not set
this in production.
Admin UI authentication
The admin UI mounts at /admin. Two modes:
- OIDC. The production default. Configure with Auth__Provider=oidc and the standard OIDC discovery URL plus a client secret. Works with Azure AD, Auth0, Keycloak, Okta, anything OIDC-compliant.
- Open dev mode. Auth__Provider=none. Useful for local development. The UI is open. Do not use this in production.
Optional: Presidio sidecar
The PII redactor's higher-recall mode runs Microsoft Presidio Analyzer as
an HTTP sidecar. It typically does not require auth on the internal
network. If you do put it behind auth, AdaptiveAPI forwards a configured
bearer token via PiiRedactor__Presidio__AuthHeader.
Quick reference
| Credential | Where it goes | How it is stored |
|---|---|---|
| DeepL API key | DEEPL_API_KEY env var | Encrypted at rest in DB if set via UI. |
| LLM-translator key | LLM_TRANSLATOR_API_KEY env var | Encrypted at rest in DB if set via UI. |
| OpenAI / Anthropic / Cohere key | Inbound Authorization header | Never stored. SHA-256 fingerprint in audit only. |
| MCP OAuth token | Inbound Authorization header | Never stored. SHA-256 fingerprint in audit only. |
| Route token | URL path segment | Argon2id hash in DB. |
| Admin UI session | OIDC provider | Issued by your IdP. AdaptiveAPI verifies signature. |
Threat-model summary. A compromised AdaptiveAPI database leaks: route token hashes (useless without preimage), translator-backend keys (rotate them), audit metadata (no bodies, no plaintext upstream tokens). It does not leak any caller's OpenAI key, Anthropic key, MCP token, or message content.