1. Add an API Key

Log in at platform.opper.ai and create an API key. Each key is tied to a project; we recommend creating a new project per application or environment. Set the key as an environment variable:
export OPPER_API_KEY="your-api-key"
2. Select API

Choose which API format to use. All options route through Opper with full observability.

Opper Multimodal API

Recommended. Built for agents and AI-native applications. Specify tasks, not prompts — 200+ models, all modalities.
Or use a compatible API — all routes add observability, virtual keys, and model routing:

OpenAI

Use the Responses and Chat Completions endpoints

Anthropic

Use the Messages endpoint

Gemini

Use the Generate Content endpoint

Opper Multimodal API

A declarative API built for agents and AI-native applications. Specify tasks, not prompts — the platform handles execution across 200+ models and all modalities with built-in observability, fallbacks, and structured output.
The simplest way to call a model — pass a string in, get a string back.
cURL
curl -X POST https://api.opper.ai/v3/call \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer ${OPPER_API_KEY}" \
  -d '{
    "name": "respond",
    "instructions": "Respond in Swedish",
    "input": "What are the benefits of using an AI gateway?"
  }'
To pin a specific model, add an optional "model" field to the request body, e.g. "model": "openai/gpt-4.1-nano".
Example output
{
  "data": "En AI-gateway ger en enda integrationspunkt för flera modeller, med fördelar som automatisk failover, lastbalansering, enhetlig loggning och möjligheten att byta leverantör utan kodändringar.",
  "meta": {
    "function_name": "respond",
    "execution_ms": 486,
    "models_used": ["openai/gpt-4.1-nano"],
    "usage": { "input_tokens": 26, "output_tokens": 293 }
  }
}
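The same request can be issued from Python using only the standard library. This is a sketch based on the cURL example above; the endpoint and request fields are taken directly from it.

```python
import json
import os
import urllib.request

# The same request body as the cURL example above.
payload = {
    "name": "respond",
    "instructions": "Respond in Swedish",
    "input": "What are the benefits of using an AI gateway?",
    # "model": "openai/gpt-4.1-nano",  # optional: pin a specific model
}


def call_opper(body: dict) -> dict:
    """POST a task to the /v3/call endpoint and return the parsed JSON."""
    req = urllib.request.Request(
        "https://api.opper.ai/v3/call",
        data=json.dumps(body).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ['OPPER_API_KEY']}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if os.getenv("OPPER_API_KEY"):
    result = call_opper(payload)
    print(result["data"])  # the model's response string
```

The response mirrors the example output above: the answer is in `data`, and execution details (models used, latency, token usage) are in `meta`.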

Where are the SDKs?

Updated Python and Node SDKs for the v3 API are coming soon. In the meantime, you can use the API directly via cURL or any HTTP client. SDKs for the previous platform version are still available:

OpenAI Compatible

Use the OpenAI SDK pointed at Opper’s compatible endpoint. Adds observability, virtual keys, and model routing to every call.
pip install openai
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.opper.ai/v3/compat",
    api_key=os.getenv("OPPER_API_KEY"),
)

response = client.chat.completions.create(
    model="openai/gpt-4o-mini",
    messages=[
        {"role": "user", "content": "Hi there!"}
    ],
)

print(response.choices[0].message.content)

Anthropic Compatible

Use the Anthropic SDK pointed at Opper’s Messages endpoint. Adds observability, virtual keys, and model routing to every call.
pip install anthropic
import os
import anthropic

client = anthropic.Anthropic(
    base_url="https://api.opper.ai/v3/compat",
    api_key=os.getenv("OPPER_API_KEY"),
)

message = client.messages.create(
    model="anthropic/claude-sonnet-4.5",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Hi there!"}
    ],
)

print(message.content[0].text)

Gemini Compatible

Use the Google GenAI SDK pointed at Opper’s Generate Content endpoint. Adds observability, virtual keys, and model routing to every call.
pip install google-genai
import os
from google import genai

client = genai.Client(
    api_key=os.getenv("OPPER_API_KEY"),
    http_options={"base_url": "https://api.opper.ai/v3/compat"},
)

response = client.models.generate_content(
    model="gcp/gemini-2.5-flash-eu",
    contents="Hi there!",
)

print(response.text)