This guide is for self-hosted OpenClaw users who want to connect LemonData as their AI provider.
Overview
For current OpenClaw versions, the recommended approach is to configure LemonData through models.providers.
If you just want to get started quickly, configuring lemondata alone is enough. Add the other providers only when you need Responses API, Claude native, Gemini native, or MiniMax native behavior.
| Provider | OpenClaw api | Best for | baseUrl |
|---|---|---|---|
| lemondata | openai-completions | GPT, DeepSeek, Qwen, and most OpenAI-compatible calls | https://api.lemondata.cc/v1 |
| lemondata-responses | openai-responses | OpenAI Responses workflows that expect /v1/responses semantics | https://api.lemondata.cc/v1 |
| lemondata-claude | anthropic-messages | Native Claude Messages API | https://api.lemondata.cc |
| lemondata-gemini | google-generative-ai | Native Gemini API format | https://api.lemondata.cc |
| lemondata-minimax | anthropic-messages | Native MiniMax routing | https://api.lemondata.cc |
Prerequisites
- A self-hosted OpenClaw instance
- A LemonData API Key — Get one here
Configuration
Edit your OpenClaw config:
- Self-hosted: ~/.openclaw/openclaw.json
models.providers:
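A sketch of the five provider entries is below. The api and baseUrl values come from the table above; the apiKey and models field names are assumptions based on common provider-config shapes, so check your OpenClaw version's schema before copying:

```json
{
  "models": {
    "providers": {
      "lemondata": {
        "api": "openai-completions",
        "baseUrl": "https://api.lemondata.cc/v1",
        "apiKey": "YOUR_LEMONDATA_API_KEY",
        "models": ["gpt-4o", "deepseek-r1"]
      },
      "lemondata-responses": {
        "api": "openai-responses",
        "baseUrl": "https://api.lemondata.cc/v1",
        "apiKey": "YOUR_LEMONDATA_API_KEY",
        "models": ["gpt-4o"]
      },
      "lemondata-claude": {
        "api": "anthropic-messages",
        "baseUrl": "https://api.lemondata.cc",
        "apiKey": "YOUR_LEMONDATA_API_KEY",
        "models": ["claude-sonnet-4-6"]
      },
      "lemondata-gemini": {
        "api": "google-generative-ai",
        "baseUrl": "https://api.lemondata.cc",
        "apiKey": "YOUR_LEMONDATA_API_KEY",
        "models": ["gemini-2.5-flash"]
      },
      "lemondata-minimax": {
        "api": "anthropic-messages",
        "baseUrl": "https://api.lemondata.cc",
        "apiKey": "YOUR_LEMONDATA_API_KEY",
        "models": ["minimax-m1"]
      }
    }
  }
}
```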
All 5 providers use the same API Key. You only need one LemonData account.
The models arrays above only show common examples. Add more model IDs to each provider as needed.
Using Models
OpenClaw still references models with the provider/model format:
Model Examples
| Provider | Model reference | Description |
|---|---|---|
| lemondata | lemondata/gpt-4o | OpenAI-compatible route |
| lemondata | lemondata/deepseek-r1 | DeepSeek reasoning model |
| lemondata-responses | lemondata-responses/gpt-4o | Responses API route |
| lemondata-claude | lemondata-claude/claude-sonnet-4-6 | Native Claude Messages route |
| lemondata-gemini | lemondata-gemini/gemini-2.5-flash | Native Gemini route |
| lemondata-minimax | lemondata-minimax/minimax-m1 | Native MiniMax route |
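Wherever OpenClaw accepts a model setting, you pass one of these references. As an illustration only (the agents key and surrounding structure here are hypothetical, not a confirmed OpenClaw schema):

```json
{
  "agents": {
    "default": {
      "model": "lemondata-claude/claude-sonnet-4-6"
    }
  }
}
```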
When to Use Which Provider
- lemondata: default choice for most general-purpose agent and chat use cases.
- lemondata-responses: use when your OpenClaw workflow explicitly depends on OpenAI Responses semantics.
- lemondata-claude: use when you want Claude’s native Messages behavior.
- lemondata-gemini: use when you want Gemini-native request/response formatting or existing Gemini-style integrations.
- lemondata-minimax: use when you want MiniMax on its native route.
If you do not need Gemini-native formatting, you can also call Gemini models as lemondata/gemini-* on the OpenAI-compatible route.
Common Mistakes
Still using the old top-level providers array
Current OpenClaw docs use models.providers. If you keep the older top-level providers array format, OpenClaw may ignore the config or fail to resolve the provider prefixes as expected.
Forgetting /v1 on lemondata-responses
openai-responses maps to LemonData’s /v1/responses path, so lemondata-responses must use https://api.lemondata.cc/v1.
Adding /v1 to lemondata-claude, lemondata-gemini, or lemondata-minimax
anthropic-messages and google-generative-ai should use https://api.lemondata.cc without /v1. Adding /v1 can produce incorrect request paths.
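One way to keep the rule straight: the OpenAI-style apis expect the /v1 prefix in baseUrl, while the native Anthropic- and Gemini-style clients append their own versioned paths to a bare host. A minimal sketch (the path-joining behavior described in the comments is an assumption about the underlying client SDKs, not documented OpenClaw internals):

```python
# Illustrative helper: pick the right LemonData baseUrl for each OpenClaw api value.
# Assumption: OpenAI-style clients expect /v1 in the base URL, while Anthropic- and
# Gemini-style clients append their own versioned paths (e.g. /v1/messages,
# /v1beta/...) to a bare host, so adding /v1 there would double the prefix.
BASE = "https://api.lemondata.cc"

def base_url_for(api: str) -> str:
    if api in ("openai-completions", "openai-responses"):
        return BASE + "/v1"
    if api in ("anthropic-messages", "google-generative-ai"):
        return BASE
    raise ValueError(f"unknown api: {api}")

print(base_url_for("openai-responses"))    # https://api.lemondata.cc/v1
print(base_url_for("anthropic-messages"))  # https://api.lemondata.cc
```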
Does OpenClaw still support native Gemini?
Yes. Current OpenClaw documentation still includes the built-in google provider and also supports custom providers using api: "google-generative-ai". So lemondata-gemini remains a valid native Gemini route for OpenClaw users.
Verify Setup
After saving the config, restart your OpenClaw instance and test with a simple message. If you see a response, the provider is configured correctly.
Next Steps
Once OpenClaw is connected, these guides help you use LemonData more effectively:
- API Formats — understand the differences between OpenAI, Responses, Anthropic, and Gemini routes
- IDE / SDK Compatibility — see when /v1/responses is the better fit
- Error Handling — learn common failure modes and recovery patterns
- Models Overview — browse model IDs before wiring them into agents