Documentation Index
Fetch the complete documentation index at: https://docs.lemondata.cc/llms.txt
Use this file to discover all available pages before exploring further.
Overview
LemonData works well with LangChain’s ChatOpenAI and OpenAIEmbeddings integrations when you stay on the standard OpenAI-compatible chat and embeddings surface.
This page intentionally covers the standard OpenAI-compatible LangChain surface, not provider-native LangChain features beyond that surface.
Current LangChain docs note that ChatOpenAI targets official OpenAI-compatible request/response shapes. If you need provider-specific, non-standard response fields, use a provider-specific LangChain integration instead of relying on ChatOpenAI.

Type: Framework or Platform
Primary Path: OpenAI-compatible standard surface
Support Confidence: Supported standard surface
Installation
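No install commands survive in this section. A typical setup for the integrations discussed on this page, assuming the standard langchain and langchain-openai packages, would be:

```shell
pip install -U langchain langchain-openai
```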
Basic Configuration
Using Different Models
Message History
Streaming
Embeddings
Simple RAG Example
Agents
For new agentic projects, LangChain recommends considering LangGraph for more explicit control over long-running and tool-using workflows.
Best Practices
Pass base_url explicitly
The most reliable LemonData setup is to pass base_url="https://api.lemondata.cc/v1" directly to ChatOpenAI and OpenAIEmbeddings instead of depending on older environment-variable aliases.
Use standard features here
Stick to standard chat, tool calling, streaming, and embeddings on ChatOpenAI. If you need vendor-native extras, switch to the vendor’s own LangChain integration.
Use cheaper models for retrieval
Use embedding models like text-embedding-3-small for retrieval and reserve stronger chat models for the final answer step.