OpenAI-Compatible Claude API — Use Claude with OpenAI SDK
Access Claude models through an OpenAI-compatible API endpoint. Use the OpenAI Python/TypeScript SDK with Claude Opus, Sonnet, and Haiku — just change the base URL.
TL;DR
Base URL: https://api2.claudestore.store/v1. Use the OpenAI SDK, change the base URL, and access all Claude models; no code rewrite needed.

Why OpenAI Compatibility Matters
Most AI tools, frameworks, and IDEs were built around the OpenAI API format (/v1/chat/completions). If you have existing code using the OpenAI SDK, migrating to Claude normally means rewriting your API calls to use Anthropic's Messages format.
ClaudeStore's OpenAI-compatible endpoint eliminates this problem. You get Claude's superior coding and reasoning capabilities while keeping your existing OpenAI-format code.
Quick Migration: OpenAI → Claude
Before (OpenAI)
```python
from openai import OpenAI

client = OpenAI(api_key="sk-openai-key")

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Write a Python function"}]
)
print(response.choices[0].message.content)
```

After (Claude via ClaudeStore)
```python
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_CLAUDESTORE_KEY",
    base_url="https://api2.claudestore.store/v1"  # the only change
)

response = client.chat.completions.create(
    model="claude-sonnet-4.6",  # use a Claude model ID
    messages=[{"role": "user", "content": "Write a Python function"}]
)
print(response.choices[0].message.content)
```

That's it: two lines changed, the base URL and the model name.
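Streaming carries over unchanged as well. A minimal sketch, assuming the same ClaudeStore key and base URL as above (the `stream_chat` helper name is illustrative):

```python
def stream_chat(prompt: str, api_key: str) -> str:
    """Stream a Claude reply through the OpenAI-compatible endpoint."""
    # OpenAI SDK v1+; imported here to keep the sketch self-contained
    from openai import OpenAI

    client = OpenAI(api_key=api_key, base_url="https://api2.claudestore.store/v1")
    stream = client.chat.completions.create(
        model="claude-sonnet-4.6",
        messages=[{"role": "user", "content": prompt}],
        stream=True,  # server sends incremental chunks instead of one response
    )
    text = ""
    for chunk in stream:
        # each chunk carries a content delta; some chunks have no choices
        if chunk.choices and chunk.choices[0].delta.content:
            piece = chunk.choices[0].delta.content
            print(piece, end="", flush=True)
            text += piece
    return text
```

Calling `stream_chat("Explain generators", "YOUR_CLAUDESTORE_KEY")` prints tokens as they arrive and returns the full reply.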
TypeScript / Node.js Example
```typescript
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "YOUR_CLAUDESTORE_KEY",
  baseURL: "https://api2.claudestore.store/v1",
});

const response = await client.chat.completions.create({
  model: "claude-sonnet-4.6",
  messages: [{ role: "user", content: "Explain async/await" }],
});
console.log(response.choices[0].message.content);
```

Supported OpenAI Endpoints
- POST /v1/chat/completions — Chat completions (streaming & non-streaming)
- GET /v1/models — List available Claude models
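Function calling uses the standard OpenAI tool schema. A sketch of the request payload (the `get_weather` tool and its parameters are illustrative, not part of any ClaudeStore API):

```python
# Standard OpenAI-format tool definition; the endpoint handles it for Claude
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # illustrative tool name
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

# The same kwargs you would pass to client.chat.completions.create(**request)
request = {
    "model": "claude-sonnet-4.6",
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": tools,
}
print(request["tools"][0]["function"]["name"])
```

If the model decides to use the tool, the call shows up in `response.choices[0].message.tool_calls`, exactly as with OpenAI models.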
Chat completions support streaming (`stream: true`), system messages, multi-turn conversations, and function calling, all in OpenAI format.

Framework & Tool Compatibility
The OpenAI-compatible endpoint works with any tool that supports custom OpenAI base URLs:
- LangChain — use `ChatOpenAI` with a custom base URL
- LlamaIndex — configure an OpenAI-compatible LLM
- Cursor IDE — set Override OpenAI Base URL
- Continue (VS Code) — set base URL in config
- AutoGen — use OpenAI-compatible config
- CrewAI — configure as OpenAI provider
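As one example, a Continue (VS Code) entry in `config.json` might look like this (a sketch; exact keys depend on your Continue version):

```json
{
  "models": [
    {
      "title": "Claude Sonnet via ClaudeStore",
      "provider": "openai",
      "model": "claude-sonnet-4.6",
      "apiKey": "YOUR_CLAUDESTORE_KEY",
      "apiBase": "https://api2.claudestore.store/v1"
    }
  ]
}
```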
Model Mapping
When migrating from OpenAI, here's the recommended Claude equivalent:
| OpenAI Model | Claude Equivalent | Best For |
|---|---|---|
| GPT-4o / GPT-4 | claude-sonnet-4.6 | General use, coding |
| o1 / o1-pro | claude-opus-4-20250514 | Complex reasoning |
| GPT-4o-mini | claude-haiku-3-5-20241022 | Fast, cheap tasks |
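If you're migrating programmatically, the table can be expressed as a small lookup; a sketch (the fallback to Sonnet for unknown IDs is a choice here, not a ClaudeStore rule):

```python
# OpenAI model ID -> recommended Claude equivalent, per the table above
MODEL_MAP = {
    "gpt-4o": "claude-sonnet-4.6",
    "gpt-4": "claude-sonnet-4.6",
    "o1": "claude-opus-4-20250514",
    "o1-pro": "claude-opus-4-20250514",
    "gpt-4o-mini": "claude-haiku-3-5-20241022",
}

def to_claude(model: str) -> str:
    """Translate an OpenAI model ID; unknown IDs fall back to Sonnet."""
    return MODEL_MAP.get(model, "claude-sonnet-4.6")

print(to_claude("gpt-4o"))  # claude-sonnet-4.6
```

Dropping `to_claude` in front of your existing model names lets the rest of the code stay untouched.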
Native Anthropic API Also Available
If you prefer the native Anthropic Messages API format, ClaudeStore supports that too:
```python
import anthropic

client = anthropic.Anthropic(
    api_key="YOUR_CLAUDESTORE_KEY",
    base_url="https://api2.claudestore.store"
)

message = client.messages.create(
    model="claude-sonnet-4.6",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello!"}]
)
print(message.content[0].text)
```

Use whichever format fits your project best; both hit the same Claude models at the same pricing.