AI21 Jamba via VerticalAPI
AI21 Jamba 1.6 family (open-weights, hybrid Mamba/Transformer, 256K context) via VerticalAPI's OpenAI-compatible endpoint. BYOK, zero markup, Studio or self-hosted.
AI21 Jamba models routed by VerticalAPI
Pass the model ID below as `model` in any OpenAI-compatible request. New AI21 Jamba models are typically supported within 24h of release.
| Model ID | Name | Context | Pricing (input / output) |
|---|---|---|---|
| `jamba-1.6-large` | Jamba 1.6 Large | 256K | $2 / $8 per 1M tok |
| `jamba-1.6-mini` | Jamba 1.6 Mini | 256K | $0.20 / $0.40 per 1M tok |
Pricing reflects AI21's own rates; you pay AI21 directly. VerticalAPI adds zero markup on tokens.
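With BYOK pricing, the per-call cost is just the table's rates applied to your token counts. A minimal sketch (the `RATES` table and `token_cost` helper are illustrative, not part of any SDK):

```python
# Per-1M-token rates from the table above: (input, output), in USD.
RATES = {
    "jamba-1.6-large": (2.00, 8.00),
    "jamba-1.6-mini": (0.20, 0.40),
}

def token_cost(model: str, prompt_tokens: int, completion_tokens: int) -> float:
    """What AI21 bills for one call; VerticalAPI adds nothing on top."""
    rate_in, rate_out = RATES[model]
    return (prompt_tokens * rate_in + completion_tokens * rate_out) / 1_000_000

# e.g. a 10K-prompt, 1K-completion call on Jamba 1.6 Mini:
print(round(token_cost("jamba-1.6-mini", 10_000, 1_000), 6))  # → 0.0024
```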
5-line AI21 Jamba call via VerticalAPI
Drop-in replacement for the OpenAI SDK. Works with the OpenAI Python client, Node, Go, curl — anything that speaks HTTP.
```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.verticalapi.com/v1",
    api_key="vapi_...",
    default_headers={"X-Provider-Key": "..."},  # your own AI21 key (BYOK)
)

response = client.chat.completions.create(
    model="jamba-1.6-mini",  # AI21 Jamba
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```
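Because the endpoint speaks plain OpenAI-compatible HTTP, no SDK is required. A stdlib-only sketch of the same request (the `/chat/completions` path and header names mirror the SDK snippet above; placeholders stay placeholders):

```python
import json
import urllib.request

payload = {
    "model": "jamba-1.6-mini",
    "messages": [{"role": "user", "content": "Hello"}],
}
req = urllib.request.Request(
    "https://api.verticalapi.com/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": "Bearer vapi_...",   # VerticalAPI key
        "X-Provider-Key": "...",              # your own AI21 key (BYOK)
        "Content-Type": "application/json",
    },
)
# To send it:
# body = json.load(urllib.request.urlopen(req))
# print(body["choices"][0]["message"]["content"])
```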
Four reasons developers route AI21 Jamba through us
Zero token markup
You pay AI21 directly with your own key. VerticalAPI's revenue is the gateway subscription, not a tax on your tokens.
One key, every provider
AI21 Jamba alongside OpenAI, Anthropic, Gemini and 12 more — same OpenAI-compatible endpoint, same SDK, switchable per-request.
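Per-request switching means only the `model` field and the BYOK header change between providers. A stdlib sketch of the idea (the `route` helper and the `gpt-4o-mini` example ID are illustrative assumptions, not gateway API surface):

```python
import json

GATEWAY = "https://api.verticalapi.com/v1/chat/completions"

def route(model: str, prompt: str, provider_key: str):
    """Build one OpenAI-compatible request; only model + BYOK header vary."""
    headers = {
        "Authorization": "Bearer vapi_...",
        "X-Provider-Key": provider_key,  # that provider's own key
        "Content-Type": "application/json",
    }
    body = json.dumps(
        {"model": model, "messages": [{"role": "user", "content": prompt}]}
    ).encode()
    return GATEWAY, headers, body

# Same endpoint, same shape, different providers:
url_a, _, _ = route("jamba-1.6-mini", "Hello", "<ai21-key>")
url_b, _, _ = route("gpt-4o-mini", "Hello", "<openai-key>")
assert url_a == url_b  # one endpoint for every provider
```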
Latency & cost monitoring
Per-request token counts, p50/p95 latency and cost dashboards out of the box. Compare AI21 Jamba to other providers on identical prompts.
Observability built in
Every AI21 Jamba call gets a trace ID, replayable payload and audit log entry. Wire to Datadog or Sentry via OpenTelemetry.
Where AI21 Jamba shines
Common questions about AI21 Jamba on VerticalAPI
How is Jamba 1.6 different from Jamba 1.5 (the AI21 vertical)?
Jamba 1.6 is the open-weights successor to Jamba 1.5 — same hybrid architecture but with improved benchmarks and the option to self-host. The AI21 vertical covers Studio-hosted Jamba 1.5; this Jamba vertical covers the open-weights release.
All supported LLM providers
Same endpoint, same SDK — just change the model and the BYOK header.
Ship on AI21 Jamba in 60 seconds
Free tier — bring your own AI21 key, zero markup, OpenAI-compatible endpoint.
Get your VerticalAPI key →