Now in private beta

Your coding agent.
Every model, one key.

Point Claude Code, Cursor, Cline, Aider — or any OpenAI-compatible SDK — at qlaud. We route to DeepSeek, Groq, OpenAI, Anthropic and friends. One key, one base URL, prepaid wallet, no subscriptions.

$5 starter credit. No credit card required to sign up.

One key, every model

Stop managing five accounts. One qlaud key reaches DeepSeek, Claude, GPT, Llama, Qwen — across both Anthropic and OpenAI SDKs.
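In practice, "one key" means every request shares the same base URL and key, and only the model string changes. A minimal sketch of that request shape (model identifiers below are illustrative, not confirmed qlaud names):

```python
# One key, one base URL; only the model name changes per request.
# Model identifiers are illustrative, not confirmed qlaud names.
BASE_URL = "https://api.qlaud.ai/v1"
API_KEY = "ak_live_..."

def chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat payload; routing is decided by `model`."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Switching providers is a one-string change:
deepseek = chat_request("deepseek-chat", "Refactor this function.")
claude = chat_request("claude-sonnet", "Refactor this function.")
```

Everything else — auth header, endpoint, message format — stays identical across providers.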

Faster than going direct

Cloudflare's edge sits closer to upstream POPs than your laptop does. Cached prompts return in ~50ms. Tree-search agents pay for the same lookup once, not ten times.
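The "pay once, not ten times" claim is easiest to see with a toy memo. This is an illustration of the caching behavior, not qlaud's actual implementation:

```python
import hashlib

# Toy sketch of edge-style prompt caching (illustration only):
# identical prompts hit the cache after the first upstream call.
cache: dict[str, str] = {}
upstream_calls = 0

def complete(prompt: str) -> str:
    global upstream_calls
    key = hashlib.sha256(prompt.encode()).hexdigest()
    if key not in cache:
        upstream_calls += 1          # only a cache miss pays upstream latency
        cache[key] = f"response:{key[:8]}"
    return cache[key]

# A tree-search agent re-evaluating the same prompt ten times:
results = [complete("expand node A") for _ in range(10)]
```

Ten identical lookups, one upstream round trip; the other nine return at cache-hit latency.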

$0.001 per coding turn

DeepSeek V3 via qlaud: ~$0.001/turn. Claude direct: ~$0.018/turn. Top up $5 for hundreds of agent sessions. No subscription.
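The arithmetic behind those numbers, as a quick sanity check (the turns-per-session figure is an assumption, not a measured value):

```python
# Back-of-envelope math behind the pricing claims above.
TURN_COST_QLAUD = 0.001   # ~$/turn, DeepSeek V3 via qlaud
TURN_COST_CLAUDE = 0.018  # ~$/turn, Claude direct
TOP_UP = 5.00             # dollars

turns = TOP_UP / TURN_COST_QLAUD              # ~5000 turns on one top-up
TURNS_PER_SESSION = 25                        # assumed session length
sessions = turns / TURNS_PER_SESSION          # ~200 agent sessions
savings = TURN_COST_CLAUDE / TURN_COST_QLAUD  # ~18x cheaper per turn
```

At an assumed 25 turns per session, a $5 top-up covers roughly 200 sessions — "hundreds", as claimed.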

Time to first token, head to head.

Lower is better. Cache hits are essentially free for repeated prompts — measured at the closest Cloudflare POP.

qlaud (cache hit): 50 ms
qlaud → DeepSeek: 280 ms
DeepSeek direct: 310 ms
OpenRouter → DeepSeek: 410 ms
Claude direct: 430 ms

Figures are approximate p50 latencies on a single-prompt warm path. Live measurements from the next probe land here automatically. Full methodology in the blog.

Drop-in. Both worlds.

Anthropic-compatible AND OpenAI-compatible — pick whichever your tool speaks.

Claude Code, Cursor, Cline, Aider…
# Two env vars. Done.
export ANTHROPIC_BASE_URL=https://api.qlaud.ai
export ANTHROPIC_API_KEY=ak_live_...

claude  # or cursor, cline, aider — same key works for all
openai-py, LangChain, Vercel AI SDK…
from openai import OpenAI

client = OpenAI(
    base_url="https://api.qlaud.ai/v1",
    api_key="ak_live_...",
)

# Model name is illustrative — any routed model works here.
resp = client.chat.completions.create(
    model="deepseek-chat",
    messages=[{"role": "user", "content": "Hello"}],
)
# Works with Vercel AI SDK, LangChain, LiteLLM, openai-node — same shape.

$5 in your wallet, two minutes to first request.

Sign up, mint a key, swap two env vars.