Integrate

LLM providers

OpenAI and Anthropic ride on typed wire-format surfaces. Change one URL, keep your SDK. Streaming, tool calls, and JSON shape all survive the round trip. No changes to retries, auth, or response parsing.

OpenAI

The OpenAI surface mounts under /v1/<route>/ and speaks the full Chat Completions and Responses APIs.

from openai import OpenAI

client = OpenAI(
    api_key="sk-...",
    base_url="https://adaptiveapi.example.com/v1/rt_yourtenant_xxxxx",
)

resp = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user",
               "content": "Was ist eine Hashmap?"}],
    extra_headers={"X-AdaptiveApi-Target-Lang": "de"},
    stream=True,
)
for chunk in resp:
    print(chunk.choices[0].delta.content or "", end="")

What you change

One thing. The base URL.

Everything else (your retries, your timeouts, your error handling, your streaming consumer) stays untouched.
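In practice the swap can live in a single environment variable. A minimal sketch, assuming a hypothetical `ADAPTIVEAPI_BASE_URL` variable (the name and helper are illustrative, not part of AdaptiveAPI):

```python
import os

def resolve_base_url() -> str:
    """Pick the Chat Completions base URL from the environment.

    ADAPTIVEAPI_BASE_URL is a hypothetical variable: set it to your route URL
    (https://adaptiveapi.example.com/v1/rt_yourtenant_xxxxx) to go through
    the proxy, or leave it unset to talk to OpenAI directly.
    """
    return os.environ.get("ADAPTIVEAPI_BASE_URL", "https://api.openai.com/v1")

# client = OpenAI(api_key="sk-...", base_url=resolve_base_url())
```

Flipping one variable between environments means staging can run through the proxy while production talks to OpenAI directly, with identical application code.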

Anthropic

The Anthropic surface mounts under /anthropic/v1/<route>/ and speaks the Messages API.

import anthropic

client = anthropic.Anthropic(
    api_key="sk-ant-...",
    base_url="https://adaptiveapi.example.com/anthropic/v1/rt_yourtenant_xxxxx",
)

message = client.messages.create(
    model="claude-3-7-sonnet-latest",
    max_tokens=1024,
    system="Tu es un assistant serviable.",
    messages=[{"role": "user",
               "content": "Explique-moi le théorème de Pythagore."}],
    extra_headers={"X-AdaptiveApi-Target-Lang": "fr"},
)

Other providers

Cohere, Mistral, Azure OpenAI, Together, Groq, your local vLLM, and anything else with an HTTP+JSON shape go through the Generic JSON API instead. You describe which JSON paths carry human text, and AdaptiveAPI handles the round trip.
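The path-description idea fits in a few lines. A sketch of the mechanism, assuming a simple dotted-path syntax; the helper and path format are illustrative, not AdaptiveAPI's actual route schema:

```python
import copy
from typing import Any, Callable

def map_text_paths(payload: dict, paths: list[str],
                   fn: Callable[[str], str]) -> dict:
    """Apply fn (e.g. a translation) to the string at each dotted path.

    Paths use dot syntax with numeric list indices, e.g.
    "messages.0.content" — mirroring the idea of declaring which JSON
    paths carry human text. The real config format may differ.
    """
    out = copy.deepcopy(payload)
    for path in paths:
        parts = path.split(".")
        node: Any = out
        for part in parts[:-1]:
            node = node[int(part)] if isinstance(node, list) else node[part]
        leaf = parts[-1]
        key = int(leaf) if isinstance(node, list) else leaf
        node[key] = fn(node[key])
    return out
```

For example, `map_text_paths(body, ["messages.0.content"], translate)` would rewrite only the user message while leaving `model`, tool definitions, and every other field untouched.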

Tip. If a provider implements an OpenAI-compatible endpoint (Together, Groq, Mistral La Plateforme, vLLM, etc.), you can often use the OpenAI surface above by setting the route's upstream to their base URL. Saves writing a generic-route definition.
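As a sketch, a route proxying Groq's OpenAI-compatible endpoint might be configured like this; the field names here are illustrative, not AdaptiveAPI's actual schema (see Routes and tokens for that):

```json
{
  "route": "rt_yourtenant_xxxxx",
  "surface": "openai",
  "upstream": "https://api.groq.com/openai/v1",
  "target_lang": "de"
}
```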

Headers and overrides

Every X-AdaptiveApi-* header is stripped before the request reaches the upstream. They never leak into your model context. The headers you can use:

Header                       Example         Purpose
X-AdaptiveApi-Target-Lang    de              What the caller speaks.
X-AdaptiveApi-Source-Lang    en              What the model thinks in.
X-AdaptiveApi-Mode           bidirectional   Direction toggle.
X-AdaptiveApi-Style-Rule     s_marketing     Apply a style rule. See Style rules.
X-AdaptiveApi-Glossary       g_default       Apply a glossary.
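The stripping rule above is easy to picture. A sketch of the behavior, not AdaptiveAPI's actual code:

```python
def split_adaptive_headers(
    headers: dict[str, str],
) -> tuple[dict[str, str], dict[str, str]]:
    """Separate X-AdaptiveApi-* overrides from headers forwarded upstream.

    Overrides steer the proxy (target language, mode, glossary, ...);
    everything else passes through to the provider untouched.
    """
    overrides, upstream = {}, {}
    for name, value in headers.items():
        if name.lower().startswith("x-adaptiveapi-"):
            overrides[name] = value
        else:
            upstream[name] = value
    return overrides, upstream
```

Your `Authorization` header, content type, and any custom headers reach the provider unchanged; only the `X-AdaptiveApi-*` family is consumed by the proxy.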

Full list lives in Routes and tokens.