API Documentation
Everything you need to integrate with Opulon API
Overview
Opulon API is a Claude-compatible AI gateway. Point your existing Claude integration at our endpoint, swap in your Opulon API key, and everything works instantly: no code rewrites, no new SDKs.
We support the Messages API with full streaming, all current Claude models, and the same request/response format you already use.
Quickstart
1. Get your API key
Purchase a plan and get your API key from the Opulon Status Page. Your key will look like op-ul-e4st85dasf.
2. Set the base URL
```shell
export ANTHROPIC_API_KEY="op-ul-e4st85dasf"
export ANTHROPIC_BASE_URL="https://api.opulonapi.com"
```
3. Make your first request
```shell
curl https://api.opulonapi.com/v1/messages \
  --header "x-api-key: $ANTHROPIC_API_KEY" \
  --header "anthropic-version: 2023-06-01" \
  --header "content-type: application/json" \
  --data '{
    "model": "claude-sonnet-4-20250514",
    "max_tokens": 1024,
    "messages": [
      {"role": "user", "content": "Hello, Claude"}
    ]
  }'
```
Response
```json
{
  "id": "msg_01XFDUDYJgAACzvnptvVoYEL",
  "type": "message",
  "role": "assistant",
  "content": [
    {
      "type": "text",
      "text": "Hello! How can I assist you today?"
    }
  ],
  "model": "claude-sonnet-4-20250514",
  "stop_reason": "end_turn",
  "usage": {
    "input_tokens": 12,
    "output_tokens": 8
  }
}
```
Authentication
All requests must include these headers:
| Header | Value | Description |
|---|---|---|
| x-api-key | op-ul-e4st85dasf | Your Opulon API key |
| anthropic-version | 2023-06-01 | API version string |
| content-type | application/json | Request body format |
If you're using the official Anthropic SDK, set the API key and base URL; the SDK handles the headers automatically.
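Outside the SDKs, the three required headers can be assembled by hand. A minimal sketch in Python (`build_headers` is an illustrative helper, not part of any SDK; the key shown is the placeholder from this page, not a real credential):

```python
# Build the three headers every Opulon request must carry.
def build_headers(api_key: str, version: str = "2023-06-01") -> dict:
    return {
        "x-api-key": api_key,
        "anthropic-version": version,
        "content-type": "application/json",
    }

# The placeholder key from this page; substitute your own.
headers = build_headers("op-ul-e4st85dasf")
```

Pass the resulting dictionary to whatever HTTP client you use.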
Base URL
```
https://api.opulonapi.com
```
Replace Anthropic's default base URL (https://api.anthropic.com) with the Opulon endpoint. All paths remain identical.
Python (Anthropic SDK)
```python
import anthropic

client = anthropic.Anthropic(
    api_key="op-ul-e4st85dasf",
    base_url="https://api.opulonapi.com"
)

message = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Hello, Claude"}
    ]
)

print(message.content[0].text)
```
TypeScript (Anthropic SDK)
```typescript
import Anthropic from "@anthropic-ai/sdk";

const client = new Anthropic({
  apiKey: "op-ul-e4st85dasf",
  baseURL: "https://api.opulonapi.com"
});

const message = await client.messages.create({
  model: "claude-sonnet-4-20250514",
  max_tokens: 1024,
  messages: [
    { role: "user", content: "Hello, Claude" }
  ]
});

console.log(message.content[0].text);
```
OpenAI SDK (Compatible)
```python
from openai import OpenAI

client = OpenAI(
    api_key="op-ul-e4st85dasf",
    base_url="https://api.opulonapi.com/v1"
)

response = client.chat.completions.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Hello, Claude"}
    ]
)

print(response.choices[0].message.content)
```
Messages API
POST /v1/messages
Send a structured list of messages to Claude and receive a response. Supports multi-turn conversations, system prompts, and tool use.
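Because the API is stateless, each request carries the full conversation so far: append the assistant's reply to your message list, then add the next user turn. A sketch of that bookkeeping (`add_turn` is an illustrative helper; no network call, just the list management):

```python
# Maintain the messages list across turns, as the Messages API expects.
def add_turn(messages: list, role: str, content: str) -> list:
    if role not in ("user", "assistant"):
        raise ValueError("role must be 'user' or 'assistant'")
    messages.append({"role": role, "content": content})
    return messages

history = []
add_turn(history, "user", "Hello, Claude")
add_turn(history, "assistant", "Hello! How can I assist you today?")
add_turn(history, "user", "Write a haiku about APIs")
# history now holds three alternating turns, ready to send as "messages"
```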
Request Body
| Parameter | Type | Required | Description |
|---|---|---|---|
| model | string | Yes | Model ID to use (e.g. claude-sonnet-4-20250514) |
| messages | array | Yes | List of message objects with role and content |
| max_tokens | integer | Yes | Maximum tokens to generate |
| system | string | No | System prompt to set context |
| temperature | float | No | Randomness (0.0–1.0, default 1.0) |
| top_p | float | No | Nucleus sampling threshold |
| stream | boolean | No | Enable server-sent events streaming |
| stop_sequences | array | No | Custom stop sequences |
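The three required parameters can be checked client-side before sending, which turns a 400 round trip into an immediate error. A sketch (`build_request` is a hypothetical helper, not part of any SDK):

```python
# Mirror the table above: model, messages, and max_tokens are mandatory.
REQUIRED = ("model", "messages", "max_tokens")

def build_request(**params) -> dict:
    missing = [k for k in REQUIRED if k not in params]
    if missing:
        raise ValueError(f"missing required parameters: {', '.join(missing)}")
    if not isinstance(params["max_tokens"], int) or params["max_tokens"] < 1:
        raise ValueError("max_tokens must be a positive integer")
    return params

body = build_request(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello, Claude"}],
    temperature=0.7,  # optional parameters pass through unchanged
)
```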
Streaming
Set stream: true to receive responses as server-sent events (SSE). Tokens are delivered as they are generated.
```shell
curl https://api.opulonapi.com/v1/messages \
  --header "x-api-key: $ANTHROPIC_API_KEY" \
  --header "anthropic-version: 2023-06-01" \
  --header "content-type: application/json" \
  --data '{
    "model": "claude-sonnet-4-20250514",
    "max_tokens": 1024,
    "stream": true,
    "messages": [
      {"role": "user", "content": "Write a haiku about APIs"}
    ]
  }'
```
Event Types
| Event | Description |
|---|---|
| message_start | Contains the message object with metadata |
| content_block_start | Start of a content block |
| content_block_delta | Incremental text content |
| content_block_stop | End of a content block |
| message_delta | Stop reason and final usage |
| message_stop | End of the message |
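The full reply can be reassembled by concatenating the text carried in content_block_delta events. A minimal sketch that walks already-decoded SSE lines (a production client would use a proper SSE library and handle the other event types too):

```python
import json

def collect_text(sse_lines):
    """Accumulate text from content_block_delta events in a decoded SSE stream."""
    text = []
    event = None
    for line in sse_lines:
        if line.startswith("event: "):
            event = line[len("event: "):]
        elif line.startswith("data: ") and event == "content_block_delta":
            payload = json.loads(line[len("data: "):])
            delta = payload.get("delta", {})
            if delta.get("type") == "text_delta":
                text.append(delta.get("text", ""))
    return "".join(text)

# Hand-written sample events in the shape described above.
sample = [
    "event: content_block_delta",
    'data: {"type": "content_block_delta", "delta": {"type": "text_delta", "text": "Hel"}}',
    "event: content_block_delta",
    'data: {"type": "content_block_delta", "delta": {"type": "text_delta", "text": "lo"}}',
    "event: message_stop",
    'data: {"type": "message_stop"}',
]
# collect_text(sample) -> "Hello"
```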
Models
The following models are available through Opulon API. Use the model ID in your requests.
| Model | Model ID | Max Output | Best For |
|---|---|---|---|
| Claude Opus 4 | claude-opus-4-20250514 | 32,000 | Complex reasoning, research |
| Claude Sonnet 4 | claude-sonnet-4-20250514 | 16,000 | Balanced speed & quality |
| Claude Haiku 3.5 | claude-3-5-haiku-20241022 | 8,192 | Fast, cost-efficient tasks |
All models support 200K token context windows. Model availability may vary by plan.
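Since each model has a different output cap, a client can clamp max_tokens before sending rather than trigger a validation error. A sketch using the caps from the table above (the dictionary and helper are illustrative, not an Opulon API):

```python
# Output caps taken from the models table above.
MAX_OUTPUT = {
    "claude-opus-4-20250514": 32_000,
    "claude-sonnet-4-20250514": 16_000,
    "claude-3-5-haiku-20241022": 8_192,
}

def clamp_max_tokens(model: str, requested: int) -> int:
    # Unknown models fall back to the smallest cap to stay safe.
    cap = MAX_OUTPUT.get(model, 8_192)
    return min(requested, cap)

# clamp_max_tokens("claude-sonnet-4-20250514", 20_000) -> 16000
```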
Rate Limits
Rate limits are configured per API key based on your plan. When limits are reached, the API returns a 429 status code.
| Limit Type | Description |
|---|---|
| Requests per minute | Maximum number of API calls per minute (varies by plan) |
| Token budget | Rolling 5-hour window token limit (input + output combined) |
View your current usage and remaining limits on the Opulon Status Page.
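The rolling 5-hour token budget can also be tracked client-side to anticipate 429s, by summing the usage figures each response reports. A hedged sketch (the 100,000-token limit is an example; actual limits depend on your plan):

```python
import time
from collections import deque

class TokenBudget:
    """Track input + output tokens spent in a rolling window (5 hours by default)."""

    def __init__(self, limit, window_seconds=5 * 3600):
        self.limit = limit
        self.window = window_seconds
        self.events = deque()  # (timestamp, tokens) pairs

    def record(self, tokens, now=None):
        self.events.append((time.time() if now is None else now, tokens))

    def used(self, now=None):
        now = time.time() if now is None else now
        # Drop entries that have aged out of the window.
        while self.events and now - self.events[0][0] > self.window:
            self.events.popleft()
        return sum(t for _, t in self.events)

    def remaining(self, now=None):
        return max(0, self.limit - self.used(now))

budget = TokenBudget(limit=100_000)
budget.record(12 + 8, now=0.0)      # usage from the quickstart response
budget.record(5_000, now=3_600.0)   # a later, larger request
# budget.remaining(now=3_600.0) -> 94980
```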
Error Handling
Errors are returned as JSON with an error object:
```json
{
  "type": "error",
  "error": {
    "type": "invalid_request_error",
    "message": "model: field required"
  }
}
```
HTTP Status Codes
| Code | Type | Description |
|---|---|---|
| 400 | invalid_request_error | Invalid or missing parameters |
| 401 | authentication_error | Invalid API key |
| 403 | permission_error | Key does not have access to the requested resource |
| 404 | not_found_error | Requested resource not found |
| 429 | rate_limit_error | Rate limit or token budget exceeded |
| 500 | api_error | Internal server error |
| 529 | overloaded_error | Upstream provider is overloaded |
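Of the codes above, 429, 500, and 529 are generally worth retrying with backoff, while the other 4xx client errors are not. A sketch of that classification plus exponential backoff with jitter (the delay values are illustrative, not an Opulon requirement):

```python
import random

# Transient failures from the status-code table above.
RETRYABLE = {429, 500, 529}

def should_retry(status: int) -> bool:
    return status in RETRYABLE

def backoff_delay(attempt: int, base: float = 1.0, cap: float = 60.0) -> float:
    """Exponential backoff with jitter: ~1s, 2s, 4s, ... capped at 60s."""
    return min(cap, base * (2 ** attempt)) * (0.5 + random.random() / 2)
```

A loop would sleep for backoff_delay(attempt) after each retryable status and re-raise on anything else.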
SDKs & Libraries
Since Opulon is fully Claude-compatible, you can use the official Anthropic SDKs. Just set the base URL to https://api.opulonapi.com.
Python
pip install anthropic
TypeScript
npm install @anthropic-ai/sdk
Java
Maven / Gradle
Go
go get github.com/anthropics/anthropic-sdk-go
You can also use the OpenAI SDK with Opulon: set the base URL to https://api.opulonapi.com/v1 and use your Opulon key as the API key.