This guide is for self-hosted OpenClaw users who want to connect LemonData as their AI provider.
Want a hosted setup instead? Create a LemonClaw instance — LemonData is already configured by default, so you can start with the WebUI right away.

Overview

For current OpenClaw versions, the recommended approach is to configure LemonData through models.providers. If you just want to get started quickly, configuring lemondata alone is enough. Add the other providers only when you need Responses API, Claude native, Gemini native, or MiniMax native behavior.
| Provider | OpenClaw api | Best for | baseUrl |
| --- | --- | --- | --- |
| lemondata | openai-completions | GPT, DeepSeek, Qwen, and most OpenAI-compatible calls | https://api.lemondata.cc/v1 |
| lemondata-responses | openai-responses | OpenAI Responses workflows that expect /v1/responses semantics | https://api.lemondata.cc/v1 |
| lemondata-claude | anthropic-messages | Native Claude Messages API | https://api.lemondata.cc |
| lemondata-gemini | google-generative-ai | Native Gemini API format | https://api.lemondata.cc |
| lemondata-minimax | anthropic-messages | Native MiniMax routing | https://api.lemondata.cc |
Use the /v1 suffix only for openai-completions and openai-responses. Native providers such as anthropic-messages and google-generative-ai should use https://api.lemondata.cc without /v1; otherwise OpenClaw may construct the wrong upstream path.
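To see why the suffix matters, here is a minimal sketch of how the two client families typically join baseUrl with their endpoint paths. This is an assumption about common client behavior (OpenAI-style clients append /chat/completions to the base; Anthropic-style clients append /v1/messages themselves), not OpenClaw's actual source:

```python
# Sketch (assumed client behavior, not OpenClaw internals): OpenAI-compatible
# clients expect the base URL to already contain /v1, while Anthropic-style
# clients add /v1/messages on their own.
BASE_OPENAI = "https://api.lemondata.cc/v1"   # for openai-completions / openai-responses
BASE_NATIVE = "https://api.lemondata.cc"      # for anthropic-messages / google-generative-ai

def openai_chat_url(base: str) -> str:
    # OpenAI-style clients append only the endpoint name.
    return base.rstrip("/") + "/chat/completions"

def anthropic_messages_url(base: str) -> str:
    # Anthropic-style clients append the versioned path themselves.
    return base.rstrip("/") + "/v1/messages"

print(openai_chat_url(BASE_OPENAI))
print(anthropic_messages_url(BASE_NATIVE))
# Putting /v1 on a native provider's base would double it:
print(anthropic_messages_url(BASE_OPENAI))  # ends in /v1/v1/messages
```

The doubled /v1/v1/messages in the last line is exactly the malformed upstream path the warning above is about.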

Prerequisites

  • A self-hosted OpenClaw instance
  • A LemonData API Key — Get one here

Configuration

Edit your OpenClaw config:
  • Self-hosted: ~/.openclaw/openclaw.json
Add LemonData providers under models.providers:
{
  agents: {
    defaults: {
      model: {
        primary: "lemondata-claude/claude-sonnet-4-6"
      }
    }
  },
  models: {
    mode: "merge",
    providers: {
      lemondata: {
        api: "openai-completions",
        baseUrl: "https://api.lemondata.cc/v1",
        apiKey: "sk-your-lemondata-key",
        models: [
          { id: "gpt-4o", name: "GPT-4o" },
          { id: "deepseek-r1", name: "DeepSeek R1" },
          { id: "qwen3-32b", name: "Qwen 3 32B" }
        ]
      },
      "lemondata-responses": {
        api: "openai-responses",
        baseUrl: "https://api.lemondata.cc/v1",
        apiKey: "sk-your-lemondata-key",
        models: [
          { id: "gpt-4o", name: "GPT-4o (Responses)" },
          { id: "gpt-5.2", name: "GPT-5.2 (Responses)" }
        ]
      },
      "lemondata-claude": {
        api: "anthropic-messages",
        baseUrl: "https://api.lemondata.cc",
        apiKey: "sk-your-lemondata-key",
        models: [
          { id: "claude-sonnet-4-6", name: "Claude Sonnet 4.6" },
          { id: "claude-opus-4-6", name: "Claude Opus 4.6" }
        ]
      },
      "lemondata-gemini": {
        api: "google-generative-ai",
        baseUrl: "https://api.lemondata.cc",
        apiKey: "sk-your-lemondata-key",
        models: [
          { id: "gemini-2.5-flash", name: "Gemini 2.5 Flash" },
          { id: "gemini-3-flash-preview", name: "Gemini 3 Flash Preview" }
        ]
      },
      "lemondata-minimax": {
        api: "anthropic-messages",
        baseUrl: "https://api.lemondata.cc",
        apiKey: "sk-your-lemondata-key",
        models: [
          { id: "minimax-m1", name: "MiniMax M1" }
        ]
      }
    }
  }
}
All five providers use the same API key, so a single LemonData account is enough.
The models arrays above list common examples only; add more model IDs to each provider as needed.
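As a quick sanity check after editing the config, you can enumerate every provider/model reference it defines. The sketch below embeds a trimmed version of the example config (with keys quoted so Python's json module can parse it; whether OpenClaw itself accepts unquoted keys is outside this snippet's scope):

```python
import json

# Minimal sketch: list every usable "provider/model" reference from a
# config shaped like the example above.
config = json.loads("""
{
  "models": {
    "providers": {
      "lemondata": {"models": [{"id": "gpt-4o"}, {"id": "deepseek-r1"}]},
      "lemondata-claude": {"models": [{"id": "claude-sonnet-4-6"}]}
    }
  }
}
""")

refs = [
    f"{provider}/{m['id']}"
    for provider, spec in config["models"]["providers"].items()
    for m in spec.get("models", [])
]
print(refs)
```

Each entry in refs is a value you could paste into agents.defaults.model.primary.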

Using Models

OpenClaw still references models with the provider/model format:
{
  agents: {
    defaults: {
      model: {
        primary: "lemondata-gemini/gemini-2.5-flash"
      }
    }
  }
}
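The reference format is simple to work with programmatically: everything before the first slash is the provider key, everything after it is the model ID. A small hypothetical helper (not part of OpenClaw) makes the rule explicit:

```python
# Hypothetical helper illustrating the "provider/model" reference format.
def split_model_ref(ref: str) -> tuple[str, str]:
    """Split a 'provider/model' reference into its two parts.

    Only the first '/' separates provider from model, so model IDs
    may themselves contain slashes.
    """
    provider, _, model = ref.partition("/")
    if not provider or not model:
        raise ValueError(f"expected 'provider/model', got {ref!r}")
    return provider, model

print(split_model_ref("lemondata-gemini/gemini-2.5-flash"))
```

Note that the provider part must match a key under models.providers exactly, including the lemondata- prefix.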

Model Examples

| Provider | Model reference | Description |
| --- | --- | --- |
| lemondata | lemondata/gpt-4o | OpenAI-compatible route |
| lemondata | lemondata/deepseek-r1 | DeepSeek reasoning model |
| lemondata-responses | lemondata-responses/gpt-4o | Responses API route |
| lemondata-claude | lemondata-claude/claude-sonnet-4-6 | Native Claude Messages route |
| lemondata-gemini | lemondata-gemini/gemini-2.5-flash | Native Gemini route |
| lemondata-minimax | lemondata-minimax/minimax-m1 | Native MiniMax route |
Browse all available models at lemondata.cc/models.

When to Use Which Provider

  • lemondata: default choice for most general-purpose agent and chat use cases.
  • lemondata-responses: use when your OpenClaw workflow explicitly depends on OpenAI Responses semantics.
  • lemondata-claude: use when you want Claude’s native Messages behavior.
  • lemondata-gemini: use when you want Gemini-native request/response formatting or existing Gemini-style integrations.
  • lemondata-minimax: use when you want MiniMax on its native route.
If you do not need Gemini-native behavior, you can still call Gemini models through lemondata/gemini-* on the OpenAI-compatible route.

Common Mistakes

Current OpenClaw docs use models.providers. If you keep the older top-level providers array format, OpenClaw may ignore the config or fail to resolve the provider prefixes as expected.
openai-responses maps to LemonData’s /v1/responses path, so lemondata-responses must use https://api.lemondata.cc/v1.
anthropic-messages and google-generative-ai should use https://api.lemondata.cc without /v1. Adding /v1 can produce incorrect request paths.
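The two baseUrl rules above can be condensed into a single check. This is a sketch of this guide's recommendation only, not an OpenClaw validation routine:

```python
# Sketch of this guide's baseUrl rules, as a checkable mapping.
EXPECTED_V1 = {
    "openai-completions": True,    # base must end in /v1
    "openai-responses": True,      # base must end in /v1
    "anthropic-messages": False,   # base must NOT end in /v1
    "google-generative-ai": False, # base must NOT end in /v1
}

def check_base_url(api: str, base_url: str) -> bool:
    """True if base_url ends the way this guide recommends for the api type."""
    wants_v1 = EXPECTED_V1[api]
    has_v1 = base_url.rstrip("/").endswith("/v1")
    return wants_v1 == has_v1

assert check_base_url("openai-completions", "https://api.lemondata.cc/v1")
assert not check_base_url("anthropic-messages", "https://api.lemondata.cc/v1")
```

Running a check like this against each entry in models.providers catches the most common misconfiguration in this guide before any request is sent.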
Current OpenClaw documentation still includes the built-in google provider and also supports custom providers using api: "google-generative-ai", so lemondata-gemini remains a valid native Gemini route for OpenClaw users.

Verify Setup

After saving the config, restart your OpenClaw instance and test with a simple message. If you see a response, the provider is configured correctly.
# Self-hosted: restart the service
systemctl --user restart openclaw    # Linux
launchctl stop cc.lemondata.openclaw && launchctl start cc.lemondata.openclaw  # macOS

Next Steps

Once OpenClaw is connected, these guides help you use LemonData more effectively: