⚠ These docs are a work in progress. Some content may be inaccurate or temporarily AI-generated.

OpenAI-Compatible Claude API — Use Claude with OpenAI SDK

Access Claude models through an OpenAI-compatible API endpoint. Use the OpenAI Python/TypeScript SDK with Claude Opus, Sonnet, and Haiku — just change the base URL.

TL;DR

ClaudeStore provides an OpenAI-compatible endpoint at https://api2.claudestore.store/v1. Use the OpenAI SDK, change the base URL, and access all Claude models — no code rewrite needed.

Why OpenAI Compatibility Matters

Most AI tools, frameworks, and IDEs were built around the OpenAI API format (/v1/chat/completions). If you have existing code using the OpenAI SDK, migrating to Claude normally means rewriting your API calls to use Anthropic's Messages format.

ClaudeStore's OpenAI-compatible endpoint eliminates this problem. You get Claude's superior coding and reasoning capabilities while keeping your existing OpenAI-format code.

Quick Migration: OpenAI → Claude

Before (OpenAI)

Existing OpenAI code:

```python
from openai import OpenAI

client = OpenAI(api_key="sk-openai-key")

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Write a Python function"}]
)
print(response.choices[0].message.content)
```

After (Claude via ClaudeStore)

Same code, now using Claude:

```python
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_CLAUDESTORE_KEY",
    base_url="https://api2.claudestore.store/v1"  # Only change!
)

response = client.chat.completions.create(
    model="claude-sonnet-4.6",  # Use Claude model ID
    messages=[{"role": "user", "content": "Write a Python function"}]
)
print(response.choices[0].message.content)
```

That's it — two lines changed: base URL and model name.

TypeScript / Node.js Example

TypeScript with the OpenAI SDK:

```typescript
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "YOUR_CLAUDESTORE_KEY",
  baseURL: "https://api2.claudestore.store/v1",
});

const response = await client.chat.completions.create({
  model: "claude-sonnet-4.6",
  messages: [{ role: "user", content: "Explain async/await" }],
});

console.log(response.choices[0].message.content);
```

Supported OpenAI Endpoints

  • POST /v1/chat/completions — Chat completions (streaming & non-streaming)
  • GET /v1/models — List available Claude models

The endpoint supports streaming (stream: true), system messages, multi-turn conversations, and function calling — all in OpenAI format.
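Whichever of these features you use, the request body is standard OpenAI chat-completions JSON. A minimal sketch of a streaming request with a system message (the model ID follows the examples above; the message text is illustrative):

```python
import json

# OpenAI-format body for POST https://api2.claudestore.store/v1/chat/completions
body = {
    "model": "claude-sonnet-4.6",
    "stream": True,  # server replies with SSE chunks instead of a single JSON object
    "messages": [
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize the README."},
    ],
}

print(json.dumps(body, indent=2))
```

Multi-turn conversations work the same way: append prior assistant turns to the messages array before the next user message.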

Framework & Tool Compatibility

The OpenAI-compatible endpoint works with any tool that supports custom OpenAI base URLs:

  • LangChain — use ChatOpenAI with custom base URL
  • LlamaIndex — configure OpenAI-compatible LLM
  • Cursor IDE — set Override OpenAI Base URL
  • Continue (VS Code) — set base URL in config
  • AutoGen — use OpenAI-compatible config
  • CrewAI — configure as OpenAI provider
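Many of these tools build on the official OpenAI SDKs, which fall back to standard environment variables when no key or base URL is passed explicitly. A minimal sketch, assuming the tool does not override these variables with its own config:

```python
import os

# The OpenAI Python and Node SDKs read these when api_key / base_url
# are not passed to the client constructor.
os.environ["OPENAI_API_KEY"] = "YOUR_CLAUDESTORE_KEY"
os.environ["OPENAI_BASE_URL"] = "https://api2.claudestore.store/v1"

# From here, a client created without arguments (e.g. OpenAI() or a
# framework's default OpenAI provider) talks to ClaudeStore instead
# of api.openai.com.
```

For tools with their own config files (Cursor, Continue), set the same two values in the tool's OpenAI provider settings instead.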

Model Mapping

When migrating from OpenAI, here's the recommended Claude equivalent:

| OpenAI Model | Claude Equivalent | Best For |
|---|---|---|
| GPT-4o / GPT-4 | claude-sonnet-4.6 | General use, coding |
| o1 / o1-pro | claude-opus-4-20250514 | Complex reasoning |
| GPT-4o-mini | claude-haiku-3-5-20241022 | Fast, cheap tasks |
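If you switch models programmatically, the recommendations above collapse to a small lookup table. A sketch (the fallback to Sonnet for unknown models is a suggestion, not ClaudeStore behavior):

```python
# OpenAI model ID -> suggested Claude equivalent, per the table above
OPENAI_TO_CLAUDE = {
    "gpt-4o": "claude-sonnet-4.6",
    "gpt-4": "claude-sonnet-4.6",
    "o1": "claude-opus-4-20250514",
    "o1-pro": "claude-opus-4-20250514",
    "gpt-4o-mini": "claude-haiku-3-5-20241022",
}

def to_claude_model(openai_model: str) -> str:
    """Map an OpenAI model ID to a Claude equivalent, defaulting to Sonnet."""
    return OPENAI_TO_CLAUDE.get(openai_model, "claude-sonnet-4.6")
```

Call it at the one place your code sets the model name, and the rest of the request stays untouched.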

Native Anthropic API Also Available

If you prefer the native Anthropic Messages API format, ClaudeStore supports that too:

Native Anthropic format:

```python
import anthropic

client = anthropic.Anthropic(
    api_key="YOUR_CLAUDESTORE_KEY",
    base_url="https://api2.claudestore.store"
)

message = client.messages.create(
    model="claude-sonnet-4.6",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello!"}]
)
```

Use whichever format fits your project best — both hit the same Claude models at the same pricing.

Ready to start?

Get API access to all Claude models in under 2 minutes.

View Plans