Overview

LemonData supports three native API formats with a single API key. Choose the format that best fits your use case; no configuration changes are needed.

OpenAI Format

Endpoint: /v1/chat/completions. Standard format with the widest compatibility.

Anthropic Format

Endpoint: /v1/messages. Extended thinking and native Claude features.

Gemini Format

Endpoint: /v1beta/models/:model:generateContent. Google ecosystem integration.

Why Multi-Format?

| Benefit | Description |
|---|---|
| No SDK switching | Use any model with your preferred SDK |
| Native features | Access format-specific capabilities |
| Easy migration | Switch from official APIs with just a base URL change |
| Single billing | One account, one API key, all formats |

Format Comparison

| Feature | OpenAI | Anthropic | Gemini |
|---|---|---|---|
| Endpoint | /v1/chat/completions | /v1/messages | /v1beta/models/:model:generateContent |
| Auth Header | Authorization: Bearer | x-api-key | Authorization: Bearer |
| System Prompt | In messages array | Separate system field | In systemInstruction |
| Extended Thinking | ❌ | ✅ | ❌ |
| Streaming | ✅ SSE | ✅ SSE | ✅ SSE |
| Tool Calling | ✅ | ✅ | ✅ |
| Vision | ✅ | ✅ | ✅ |
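The endpoint and auth-header differences above can be captured in a small helper. The function and dictionary below are illustrative only, not part of any LemonData SDK:

```python
# Hypothetical helper mirroring the comparison table: each native format's
# endpoint path and auth header. Not an official SDK API.
FORMATS = {
    "openai":    {"path": "/v1/chat/completions",
                  "auth": lambda key: {"Authorization": f"Bearer {key}"}},
    "anthropic": {"path": "/v1/messages",
                  "auth": lambda key: {"x-api-key": key}},
    "gemini":    {"path": "/v1beta/models/{model}:generateContent",
                  "auth": lambda key: {"Authorization": f"Bearer {key}"}},
}

def build_request(fmt: str, model: str, key: str) -> tuple[str, dict]:
    """Return (url, headers) for the given format and model."""
    spec = FORMATS[fmt]
    url = "https://api.lemondata.cc" + spec["path"].format(model=model)
    return url, spec["auth"](key)

url, headers = build_request("anthropic", "claude-sonnet-4-6", "sk-demo")
print(url)      # https://api.lemondata.cc/v1/messages
print(headers)  # {'x-api-key': 'sk-demo'}
```

Note that only the Gemini path embeds the model name in the URL; the other two formats carry it in the request body.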

OpenAI Format

The most widely compatible format and the default starting point for most new integrations. Works across the broadest set of LemonData models.
from openai import OpenAI

client = OpenAI(
    api_key="sk-your-lemondata-key",
    base_url="https://api.lemondata.cc/v1"
)

# Works with ANY model
response = client.chat.completions.create(
    model="claude-sonnet-4-6",  # Claude via OpenAI format
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"}
    ]
)
Best for:
  • General use
  • Existing OpenAI SDK integrations
  • Maximum compatibility

Anthropic Format

Native Anthropic Messages API. Required for Claude-specific features like extended thinking.
from anthropic import Anthropic

client = Anthropic(
    api_key="sk-your-lemondata-key",
    base_url="https://api.lemondata.cc"  # No /v1 suffix!
)

message = client.messages.create(
    model="claude-sonnet-4-6",
    max_tokens=1024,
    system="You are a helpful assistant.",  # Separate system field
    messages=[
        {"role": "user", "content": "Hello!"}
    ]
)

Extended Thinking (Claude Opus 4.6)

Only available in Anthropic format:
message = client.messages.create(
    model="claude-opus-4-6",
    max_tokens=16000,
    thinking={
        "type": "enabled",
        "budget_tokens": 10000
    },
    messages=[{"role": "user", "content": "Solve this complex problem..."}]
)

# Access thinking process
for block in message.content:
    if block.type == "thinking":
        print(f"Thinking: {block.thinking}")
    elif block.type == "text":
        print(f"Answer: {block.text}")
Best for:
  • Claude-specific features
  • Extended thinking mode
  • Native Anthropic SDK users

Gemini Format

Native Google Gemini API format for Google ecosystem integration.
curl "https://api.lemondata.cc/v1beta/models/gemini-2.5-flash:generateContent" \
  -H "Authorization: Bearer sk-your-lemondata-key" \
  -H "Content-Type: application/json" \
  -d '{
    "contents": [{
      "parts": [{"text": "Hello!"}]
    }],
    "systemInstruction": {
      "parts": [{"text": "You are a helpful assistant."}]
    }
  }'
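The same request body can be built in Python and posted with any HTTP client. The builder functions below are illustrative (not SDK functions) and highlight where each format places the system prompt:

```python
# Illustrative builders showing system-prompt placement per format.
def gemini_body(user_text: str, system_text: str) -> dict:
    # Gemini format: system prompt lives in a top-level systemInstruction field.
    return {
        "contents": [{"parts": [{"text": user_text}]}],
        "systemInstruction": {"parts": [{"text": system_text}]},
    }

def openai_body(model: str, user_text: str, system_text: str) -> dict:
    # OpenAI format: system prompt is just another entry in the messages array.
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system_text},
            {"role": "user", "content": user_text},
        ],
    }

print(gemini_body("Hello!", "You are a helpful assistant."))
```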

Streaming

curl "https://api.lemondata.cc/v1beta/models/gemini-2.5-flash:streamGenerateContent?alt=sse" \
  -H "Authorization: Bearer sk-your-lemondata-key" \
  -H "Content-Type: application/json" \
  -d '{
    "contents": [{"parts": [{"text": "Write a story"}]}]
  }'
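The streaming endpoint returns Server-Sent Events. A minimal parser for the `data:` lines might look like the sketch below; the sample chunks are illustrative, not captured output:

```python
import json

def parse_sse_lines(lines):
    """Yield the JSON payload of each SSE 'data:' line, skipping other lines."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":  # some formats signal end-of-stream this way
            break
        yield json.loads(payload)

# Illustrative chunks shaped like a Gemini streaming response
sample = [
    'data: {"candidates": [{"content": {"parts": [{"text": "Once"}]}}]}',
    'data: {"candidates": [{"content": {"parts": [{"text": " upon"}]}}]}',
]
for chunk in parse_sse_lines(sample):
    print(chunk["candidates"][0]["content"]["parts"][0]["text"], end="")
```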
Best for:
  • Google Cloud integrations
  • Existing Gemini SDK code
  • Native Gemini features

Choosing the Right Format

  • Starting fresh or need maximum compatibility: use the OpenAI format
  • Need Claude-specific features such as extended thinking: use the Anthropic format
  • Integrating with the Google ecosystem or reusing Gemini code: use the Gemini format

Migration Guides

From OpenAI Official API

# Before (OpenAI)
client = OpenAI(api_key="sk-openai-key")

# After (LemonData)
client = OpenAI(
    api_key="sk-lemondata-key",
    base_url="https://api.lemondata.cc/v1"  # Add this line
)
# That's it! Same code works

From Anthropic Official API

# Before (Anthropic)
client = Anthropic(api_key="sk-ant-key")

# After (LemonData)
client = Anthropic(
    api_key="sk-lemondata-key",
    base_url="https://api.lemondata.cc"  # Add this line (no /v1!)
)

From Google AI Studio

# Before (Google)
import google.generativeai as genai
genai.configure(api_key="google-api-key")

# After (LemonData) - Use REST API
import requests

response = requests.post(
    "https://api.lemondata.cc/v1beta/models/gemini-2.5-flash:generateContent",
    headers={"Authorization": "Bearer sk-lemondata-key"},
    json={"contents": [{"parts": [{"text": "Hello"}]}]}
)

Cross-Model Compatibility

The magic of LemonData: use any SDK with any model. The gateway automatically handles format conversion.

Any SDK → Any Model

# Anthropic SDK with GPT-4o (auto-converts to OpenAI format)
from anthropic import Anthropic

client = Anthropic(
    api_key="sk-lemondata-key",
    base_url="https://api.lemondata.cc"
)

response = client.messages.create(
    model="gpt-4o",  # ✅ Works! Auto-converted
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello!"}]
)

# Same SDK, different models - no code changes
response = client.messages.create(model="gemini-2.5-flash", ...)  # ✅ Works!
response = client.messages.create(model="deepseek-r1", ...)       # ✅ Works!
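Conceptually, the gateway's conversion from an OpenAI-style request to an Anthropic-style one looks like the simplified sketch below. This is an assumption about the general shape of the translation, not LemonData's actual implementation, and a real converter also handles tools, images, and streaming:

```python
def openai_to_anthropic(openai_req: dict) -> dict:
    """Simplified sketch: system messages move out of the messages array
    into the top-level 'system' field, and max_tokens becomes required."""
    system_parts = [m["content"] for m in openai_req["messages"]
                    if m["role"] == "system"]
    other = [m for m in openai_req["messages"] if m["role"] != "system"]
    converted = {
        "model": openai_req["model"],
        "max_tokens": openai_req.get("max_tokens", 1024),  # Anthropic requires it
        "messages": other,
    }
    if system_parts:
        converted["system"] = "\n".join(system_parts)
    return converted

req = {"model": "claude-sonnet-4-6",
       "messages": [{"role": "system", "content": "Be brief."},
                    {"role": "user", "content": "Hello!"}]}
print(openai_to_anthropic(req))
```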

OpenAI SDK → All Models

from openai import OpenAI

client = OpenAI(base_url="https://api.lemondata.cc/v1", api_key="sk-...")

# All these work with the same SDK:
response = client.chat.completions.create(model="gpt-4o", ...)
response = client.chat.completions.create(model="claude-sonnet-4-6", ...)
response = client.chat.completions.create(model="gemini-2.5-flash", ...)

Industry Comparison

| Platform | OpenAI Format | Anthropic Format | Gemini Format | Responses API |
|---|---|---|---|---|
| LemonData | ✅ All models | ✅ All models | ✅ All models | ✅ All models |
| OpenRouter | ✅ All models | ❌ | ❌ | ❌ |
| Together AI | ✅ All models | ❌ | ❌ | ❌ |
| Fireworks | ✅ All models | ❌ | ❌ | ❌ |
While cross-format calls work for most features, format-specific capabilities (like Anthropic extended thinking) require the native format.