OpenRouter via VerticalAPI

Route through OpenRouter's aggregated catalog (Claude, GPT, Llama, DeepSeek, Qwen) via VerticalAPI's OpenAI-compatible endpoint. Bring your own OpenRouter key (BYOK), pay zero token markup, and get automatic provider failover.

Endpoint: https://api.verticalapi.com/v1/chat/completions  ·  BYOK header: X-Provider-Key: sk-or-...
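As a quick smoke test, the endpoint and header above can be exercised directly with curl. This is a minimal sketch: both keys are placeholders you must replace, and the `--max-time` flag is just a safety timeout for scripting.

```shell
# Minimal chat completion through the VerticalAPI gateway.
# vapi_... is your VerticalAPI gateway key; sk-or-... is your own
# OpenRouter key (BYOK). Both are placeholders here.
VAPI_KEY="vapi_..."
OR_KEY="sk-or-..."
PAYLOAD='{"model":"deepseek/deepseek-chat-v3","messages":[{"role":"user","content":"Hello"}]}'

curl -s --max-time 15 https://api.verticalapi.com/v1/chat/completions \
  -H "Authorization: Bearer $VAPI_KEY" \
  -H "X-Provider-Key: $OR_KEY" \
  -H "Content-Type: application/json" \
  -d "$PAYLOAD"
```

Any HTTP client works the same way; only the two headers and the JSON body matter.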

OpenRouter models routed by VerticalAPI

Pass the model ID below as the model field in any OpenAI-compatible request. New OpenRouter models are typically supported within 24 hours of release.

Model ID                          | Name                   | Context | Pricing (input / output, per 1M tokens)
anthropic/claude-sonnet-4.5       | Claude Sonnet 4.5 (OR) | 200K    | $3 / $15 + ~5% OR fee
openai/gpt-4o                     | GPT-4o (OR)            | 128K    | $2.50 / $10 + ~5% OR fee
deepseek/deepseek-chat-v3         | DeepSeek V3 (OR)       | 64K     | $0.27 / $1.10
meta-llama/llama-3.3-70b-instruct | Llama 3.3 70B (OR)     | 128K    | $0.30 / $0.50

Pricing reflects OpenRouter's rates — you pay OpenRouter directly. VerticalAPI adds zero markup on tokens.
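To make the fee concrete, here is the arithmetic for an effective per-1M-token rate once OpenRouter's approximate 5% fee is applied. The rates and fee come from the table above; the helper name is illustrative.

```python
# Effective per-1M-token rate after OpenRouter's ~5% fee
# (fee and base rates taken from the pricing table above).
OR_FEE = 0.05

def effective_rate(base_usd_per_1m: float, fee: float = OR_FEE) -> float:
    """Base provider rate plus OpenRouter's percentage fee."""
    return round(base_usd_per_1m * (1 + fee), 4)

# Claude Sonnet 4.5 via OpenRouter: $3 input / $15 output per 1M tokens
print(effective_rate(3.0))   # input:  3.15
print(effective_rate(15.0))  # output: 15.75
```

VerticalAPI adds nothing on top of these numbers; the gateway subscription is the only charge.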

5-line OpenRouter call via VerticalAPI

Drop-in replacement for the OpenAI SDK. Works with the OpenAI Python client, Node, Go, curl — anything that speaks HTTP.

openrouter_quickstart.py (Python)
from openai import OpenAI

client = OpenAI(
    base_url="https://api.verticalapi.com/v1",
    api_key="vapi_...",
    default_headers={"X-Provider-Key": "sk-or-..."}
)

response = client.chat.completions.create(
    model="anthropic/claude-sonnet-4.5",  # OpenRouter
    messages=[{"role": "user", "content": "Hello"}]
)
print(response.choices[0].message.content)

Four reasons developers route OpenRouter through us

Zero token markup

You pay OpenRouter directly with your own key. VerticalAPI's revenue is the gateway subscription, not a tax on your tokens.

One key, every provider

OpenRouter alongside OpenAI, Anthropic, Gemini and 12 more — same OpenAI-compatible endpoint, same SDK, switchable per-request.
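Per-request switching boils down to changing one string. A minimal sketch, assuming the OpenAI SDK's chat.completions.create keyword arguments; chat_kwargs is a hypothetical helper, not part of any SDK:

```python
# Sketch: the only thing that changes per request is the model string.
# The kwargs mirror client.chat.completions.create(...) in the OpenAI SDK.
def chat_kwargs(model: str, prompt: str) -> dict:
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Same gateway, same key, different upstream model per request:
claude_req = chat_kwargs("anthropic/claude-sonnet-4.5", "Hello")
llama_req = chat_kwargs("meta-llama/llama-3.3-70b-instruct", "Hello")

print(claude_req["model"])  # anthropic/claude-sonnet-4.5
print(llama_req["model"])   # meta-llama/llama-3.3-70b-instruct
```

Each dict would be unpacked straight into `client.chat.completions.create(**kwargs)` against the same client from the quickstart above.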

Latency & cost monitoring

Per-request token counts, p50/p95 latency and cost dashboards out of the box. Compare OpenRouter to other providers on identical prompts.

Observability built in

Every OpenRouter call gets a trace ID, replayable payload and audit log entry. Wire to Datadog or Sentry via OpenTelemetry.

Where OpenRouter shines

Model A/B testing  ·  automatic failover  ·  rare model access  ·  single-key sprawl reduction

Common questions about OpenRouter on VerticalAPI

Why use OpenRouter via VerticalAPI?

OpenRouter aggregates 300+ models behind one key, but adds ~5% markup and limited observability. VerticalAPI gives you the same aggregation plus per-request traces, dashboards and the ability to swap to direct provider keys without code changes.

Can I switch from OpenRouter to direct providers later?

Yes — VerticalAPI is designed for this. Add an OpenAI key, change the X-Provider-Key header, keep the same model name. No SDK migration.
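A sketch of what that switch looks like in config terms, assuming the same OpenAI-client keyword arguments used in the quickstart above; client_config is a hypothetical helper and sk-... stands in for a direct OpenAI-style key:

```python
# Sketch: switching upstream providers is a header change, not a rewrite.
# The gateway endpoint and VerticalAPI key stay fixed; only the
# X-Provider-Key header differs.
BASE = {"base_url": "https://api.verticalapi.com/v1", "api_key": "vapi_..."}

def client_config(provider_key: str) -> dict:
    """Keyword arguments you would pass to OpenAI(**config)."""
    return {**BASE, "default_headers": {"X-Provider-Key": provider_key}}

via_openrouter = client_config("sk-or-...")  # OpenRouter BYOK
via_direct = client_config("sk-...")         # direct provider key (placeholder)

# Endpoint and gateway key are identical; only the header changed.
assert via_openrouter["base_url"] == via_direct["base_url"]
print(via_direct["default_headers"]["X-Provider-Key"])
```

Because the rest of the request is unchanged, no SDK code outside client construction needs to be touched.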