Overview

LemonData supports three native API formats with a single API key. Choose the format that best fits your use case - no configuration changes needed.

  • OpenAI Format: /v1/chat/completions (standard format, widest compatibility)
  • Anthropic Format: /v1/messages (extended thinking, native Claude features)
  • Gemini Format: /v1beta/models/:model:generateContent (Google ecosystem integration)

Why Multi-Format?

| Benefit          | Description                                            |
|------------------|--------------------------------------------------------|
| No SDK switching | Use any model with your preferred SDK                  |
| Native features  | Access format-specific capabilities                    |
| Easy migration   | Switch from official APIs with just a base URL change  |
| Single billing   | One account, one API key, all formats                  |

Format Comparison

| Feature           | OpenAI                | Anthropic             | Gemini                                |
|-------------------|-----------------------|-----------------------|---------------------------------------|
| Endpoint          | /v1/chat/completions  | /v1/messages          | /v1beta/models/:model:generateContent |
| Auth Header       | Authorization: Bearer | x-api-key             | Authorization: Bearer                 |
| System Prompt     | In messages array     | Separate system field | In systemInstruction                  |
| Extended Thinking | ❌                    | ✅                    | ❌                                    |
| Streaming         | ✅ SSE                | ✅ SSE                | ✅ SSE                                |
| Tool Calling      |                       |                       |                                       |
| Vision            |                       |                       |                                       |
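
To make the auth-header difference concrete, here is a minimal sketch using raw HTTP requests from Python (the anthropic-version header is an assumption carried over from the official Anthropic API; the SDK examples below set these headers for you):

import requests

API_KEY = "sk-your-lemondata-key"

# OpenAI format: key sent as a Bearer token in the Authorization header
openai_resp = requests.post(
    "https://api.lemondata.cc/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "claude-3-5-sonnet-20241022",
        "messages": [{"role": "user", "content": "Hello!"}],
    },
)

# Anthropic format: key sent in the x-api-key header instead
anthropic_resp = requests.post(
    "https://api.lemondata.cc/v1/messages",
    headers={
        "x-api-key": API_KEY,
        "anthropic-version": "2023-06-01",  # assumed; required by the official Anthropic API
    },
    json={
        "model": "claude-3-5-sonnet-20241022",
        "max_tokens": 256,
        "messages": [{"role": "user", "content": "Hello!"}],
    },
)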

OpenAI Format

The most widely compatible format. Works with all LemonData models.
from openai import OpenAI

client = OpenAI(
    api_key="sk-your-lemondata-key",
    base_url="https://api.lemondata.cc/v1"
)

# Works with ANY model
response = client.chat.completions.create(
    model="claude-3-5-sonnet-20241022",  # Claude via OpenAI format
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"}
    ]
)
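
Streaming also works through the OpenAI format. A minimal sketch, assuming the standard stream=True interface of the OpenAI SDK is passed through unchanged:

# `client` is the OpenAI client configured above
stream = client.chat.completions.create(
    model="claude-3-5-sonnet-20241022",
    messages=[{"role": "user", "content": "Write a haiku about lemons."}],
    stream=True,  # receive the reply incrementally over SSE
)

for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)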
Best for:
  • General use
  • Existing OpenAI SDK integrations
  • Maximum compatibility

Anthropic Format

Native Anthropic Messages API. Required for Claude-specific features like extended thinking.
from anthropic import Anthropic

client = Anthropic(
    api_key="sk-your-lemondata-key",
    base_url="https://api.lemondata.cc"  # No /v1 suffix!
)

message = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    system="You are a helpful assistant.",  # Separate system field
    messages=[
        {"role": "user", "content": "Hello!"}
    ]
)
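
Streaming works here as well. A minimal sketch, assuming the official Anthropic SDK's streaming helper behaves the same against the LemonData base URL:

# `client` is the Anthropic client configured above
with client.messages.stream(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Write a haiku about lemons."}],
) as stream:
    for text in stream.text_stream:  # incremental text deltas
        print(text, end="", flush=True)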

Extended Thinking (Claude 3.7+)

Only available in Anthropic format:
message = client.messages.create(
    model="claude-3-7-sonnet-20250219-thinking",
    max_tokens=16000,
    thinking={
        "type": "enabled",
        "budget_tokens": 10000
    },
    messages=[{"role": "user", "content": "Solve this complex problem..."}]
)

# Access thinking process
for block in message.content:
    if block.type == "thinking":
        print(f"Thinking: {block.thinking}")
    elif block.type == "text":
        print(f"Answer: {block.text}")
Best for:
  • Claude-specific features
  • Extended thinking mode
  • Native Anthropic SDK users

Gemini Format

Native Google Gemini API format for Google ecosystem integration.
curl "https://api.lemondata.cc/v1beta/models/gemini-2.5-flash:generateContent" \
  -H "Authorization: Bearer sk-your-lemondata-key" \
  -H "Content-Type: application/json" \
  -d '{
    "contents": [{
      "parts": [{"text": "Hello!"}]
    }],
    "systemInstruction": {
      "parts": [{"text": "You are a helpful assistant."}]
    }
  }'
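
The response follows the standard Gemini generateContent shape, with the generated text nested under candidates, content, and parts. A minimal Python sketch of the same call, assuming that response shape is passed through unchanged:

import requests

resp = requests.post(
    "https://api.lemondata.cc/v1beta/models/gemini-2.5-flash:generateContent",
    headers={"Authorization": "Bearer sk-your-lemondata-key"},
    json={
        "contents": [{"parts": [{"text": "Hello!"}]}],
        "systemInstruction": {"parts": [{"text": "You are a helpful assistant."}]},
    },
)
resp.raise_for_status()

data = resp.json()
# Standard Gemini response shape: candidates -> content -> parts -> text
print(data["candidates"][0]["content"]["parts"][0]["text"])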

Streaming

curl "https://api.lemondata.cc/v1beta/models/gemini-2.5-flash:streamGenerateContent?alt=sse" \
  -H "Authorization: Bearer sk-your-lemondata-key" \
  -H "Content-Type: application/json" \
  -d '{
    "contents": [{"parts": [{"text": "Write a story"}]}]
  }'
Best for:
  • Google Cloud integrations
  • Existing Gemini SDK code
  • Native Gemini features

Choosing the Right Format

Use the OpenAI format for general use and maximum compatibility, the Anthropic format when you need Claude-specific features such as extended thinking, and the Gemini format for Google ecosystem or existing Gemini SDK integrations.

Migration Guides

From OpenAI Official API

# Before (OpenAI)
client = OpenAI(api_key="sk-openai-key")

# After (LemonData)
client = OpenAI(
    api_key="sk-lemondata-key",
    base_url="https://api.lemondata.cc/v1"  # Add this line
)
# That's it! Same code works
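
If you prefer not to edit code at all, recent versions of the OpenAI Python SDK also pick up the key and base URL from environment variables; OPENAI_API_KEY is long-standing, while OPENAI_BASE_URL support depends on your SDK version, so verify it before relying on it:

# Set in your shell instead of editing code:
#   export OPENAI_API_KEY="sk-lemondata-key"
#   export OPENAI_BASE_URL="https://api.lemondata.cc/v1"
from openai import OpenAI

client = OpenAI()  # reads both values from the environment if supported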

From Anthropic Official API

# Before (Anthropic)
client = Anthropic(api_key="sk-ant-key")

# After (LemonData)
client = Anthropic(
    api_key="sk-lemondata-key",
    base_url="https://api.lemondata.cc"  # Add this line (no /v1!)
)

From Google AI Studio

# Before (Google)
import google.generativeai as genai
genai.configure(api_key="google-api-key")

# After (LemonData) - Use REST API
import requests

response = requests.post(
    "https://api.lemondata.cc/v1beta/models/gemini-2.5-flash:generateContent",
    headers={"Authorization": "Bearer sk-lemondata-key"},
    json={"contents": [{"parts": [{"text": "Hello"}]}]}
)

Cross-Model Compatibility

The magic of LemonData is that you can use any model with any format:
# OpenAI format with Claude model
client = OpenAI(base_url="https://api.lemondata.cc/v1", api_key="sk-...")
response = client.chat.completions.create(
    model="claude-3-5-sonnet-20241022",  # ✅ Works!
    messages=[{"role": "user", "content": "Hello"}]
)

# OpenAI format with Gemini model
response = client.chat.completions.create(
    model="gemini-2.5-flash",  # ✅ Works!
    messages=[{"role": "user", "content": "Hello"}]
)
Cross-format requests cover most use cases, but format-specific capabilities (such as Anthropic extended thinking) still require the native format.