AI21 Jamba via VerticalAPI

AI21 Jamba 1.6 family (open-weights, hybrid Mamba/Transformer, 256K context) via VerticalAPI's OpenAI-compatible endpoint. BYOK, zero markup, Studio or self-hosted.

Endpoint: https://api.verticalapi.com/v1/chat/completions  ·  BYOK header: X-Provider-Key: <ai21-jamba-key>
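A raw request to that endpoint carries two credentials: your VerticalAPI gateway key as a standard `Authorization: Bearer` header, and your own AI21 key in `X-Provider-Key`. A minimal sketch using only the Python standard library (the request is built but not sent; the placeholder key values are illustrative):

```python
import json
import urllib.request

# Build (but don't send) a chat-completions request to the VerticalAPI
# endpoint, showing both headers the gateway expects.
payload = {
    "model": "jamba-1.6-mini",
    "messages": [{"role": "user", "content": "Hello"}],
}
req = urllib.request.Request(
    "https://api.verticalapi.com/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer vapi_...",    # your VerticalAPI gateway key
        "X-Provider-Key": "<ai21-jamba-key>",  # BYOK: your own AI21 key
    },
    method="POST",
)
print(req.full_url)
```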

AI21 Jamba models routed by VerticalAPI

Pass the model ID below as model in any OpenAI-compatible request. New AI21 Jamba models are typically supported within 24h of release.

Model ID · Name · Context · Pricing (provider)
jamba-1.6-large · Jamba 1.6 Large · 256K · $2 / $8 per 1M tok
jamba-1.6-mini · Jamba 1.6 Mini · 256K · $0.20 / $0.40 per 1M tok

Pricing reflects AI21's own rates: you pay AI21 directly with your key. VerticalAPI adds zero markup on tokens.
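Since rates are quoted as input / output dollars per 1M tokens, estimating a call's provider cost is simple arithmetic. A sketch using the rates from the table above:

```python
# Rates from the table above, in dollars per 1M tokens: (input, output).
RATES = {
    "jamba-1.6-large": (2.00, 8.00),
    "jamba-1.6-mini": (0.20, 0.40),
}

def call_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Provider cost of one call; VerticalAPI adds no per-token markup."""
    in_rate, out_rate = RATES[model]
    return (input_tokens * in_rate + output_tokens * out_rate) / 1_000_000

# e.g. a 200K-token long-document prompt with a 1K-token answer on Mini:
print(round(call_cost("jamba-1.6-mini", 200_000, 1_000), 4))  # 0.0404
```

The same prompt on Large costs 0.408, a 10x difference, which is why the long-document use cases below usually start on Mini.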

5-line AI21 Jamba call via VerticalAPI

Drop-in replacement for the OpenAI SDK. Works with the OpenAI Python client, Node, Go, curl — anything that speaks HTTP.

jamba_quickstart.py
from openai import OpenAI

client = OpenAI(
    base_url="https://api.verticalapi.com/v1",
    api_key="vapi_...",
    default_headers={"X-Provider-Key": "..."}
)

response = client.chat.completions.create(
    model="jamba-1.6-mini",  # AI21 Jamba
    messages=[{"role": "user", "content": "Hello"}]
)
print(response.choices[0].message.content)

Four reasons developers route AI21 Jamba through us

Zero token markup

You pay AI21 directly with your own key. VerticalAPI's revenue is the gateway subscription, not a tax on your tokens.

One key, every provider

AI21 Jamba alongside OpenAI, Anthropic, Gemini and 12 more — same OpenAI-compatible endpoint, same SDK, switchable per-request.
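Because every provider sits behind the same OpenAI-compatible endpoint, switching is just a different model string in an otherwise identical payload. A sketch ("gpt-4o" is an illustrative ID for another routed provider, not a confirmed VerticalAPI model ID):

```python
def chat_request(model: str, prompt: str) -> dict:
    """Same OpenAI-compatible payload shape for every provider the
    gateway fronts; only the model string changes per request."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Swap providers per-request by changing the model ID only:
jamba = chat_request("jamba-1.6-mini", "Summarize this contract.")
other = chat_request("gpt-4o", "Summarize this contract.")
```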

Latency & cost monitoring

Per-request token counts, p50/p95 latency and cost dashboards out of the box. Compare AI21 Jamba to other providers on identical prompts.
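The p50/p95 figures in those dashboards are plain percentiles over per-request latency samples. A minimal sketch of the idea using the nearest-rank method (the sample numbers are made up, not real Jamba latencies):

```python
def percentile(samples: list[float], pct: float) -> float:
    """Nearest-rank percentile, a common scheme for latency dashboards."""
    ranked = sorted(samples)
    k = max(0, min(len(ranked) - 1, round(pct / 100 * len(ranked)) - 1))
    return ranked[k]

# Hypothetical per-request latencies (ms) for one model on one prompt:
jamba_ms = [180, 210, 190, 220, 205, 650, 195, 200, 185, 215]
p50, p95 = percentile(jamba_ms, 50), percentile(jamba_ms, 95)
```

Note how a single slow outlier (650 ms) leaves p50 untouched but dominates p95, which is why both numbers are shown side by side.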

Observability built in

Every AI21 Jamba call gets a trace ID, replayable payload and audit log entry. Wire to Datadog or Sentry via OpenTelemetry.

Where AI21 Jamba shines

long-doc QA at low cost · structured JSON output · self-hosted (open weights) · compliance-friendly fine-tunes
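For the structured-JSON use case, a small validation step on the reply pays off regardless of provider. A generic sketch (not a VerticalAPI or AI21 feature): some models wrap JSON answers in a code fence, so strip it before parsing.

```python
import json

def parse_strict_json(raw: str) -> dict:
    """Parse a model reply that was prompted to answer in JSON only,
    tolerating an optional ```json code fence around the object."""
    cleaned = raw.strip().removeprefix("```json").removesuffix("```").strip()
    return json.loads(cleaned)

# A fenced reply of the kind a JSON-prompted model might return:
reply = '```json\n{"title": "MSA", "pages": 42}\n```'
doc = parse_strict_json(reply)
```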

Common questions about AI21 Jamba on VerticalAPI

How is Jamba 1.6 different from Jamba 1.5 (the AI21 vertical)?

Jamba 1.6 is the open-weights successor to Jamba 1.5 — same hybrid architecture but with improved benchmarks and the option to self-host. The AI21 vertical covers Studio-hosted Jamba 1.5; this Jamba vertical covers the open-weights release.