LemonData works well with LangChain’s ChatOpenAI and OpenAIEmbeddings integrations when you stay on the standard OpenAI-compatible chat and embeddings surface.
Current LangChain docs note that ChatOpenAI targets official OpenAI-compatible request/response shapes. If you need provider-specific, non-standard response fields, use a provider-specific LangChain integration instead of relying on ChatOpenAI.
```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="gpt-5.4",
    api_key="sk-your-lemondata-key",
    base_url="https://api.lemondata.cc/v1",
)

response = llm.invoke("Explain LemonData in one sentence.")
print(response.content)
```
```python
from langchain_core.messages import HumanMessage, SystemMessage

messages = [
    SystemMessage(content="You are a helpful assistant."),
    HumanMessage(content="What is the capital of France?"),
]

response = llm.invoke(messages)
print(response.content)
```
The most reliable LemonData setup is to pass base_url="https://api.lemondata.cc/v1" directly to ChatOpenAI and OpenAIEmbeddings instead of depending on older environment-variable aliases.
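A minimal configuration sketch of that setup, passing the same `base_url` to both clients. The API key is a placeholder, and `text-embedding-3-small` is used as an example embedding model; no request is sent at construction time.

```python
from langchain_openai import ChatOpenAI, OpenAIEmbeddings

# Point both the chat and embeddings clients at LemonData explicitly,
# rather than relying on environment-variable aliases.
llm = ChatOpenAI(
    model="gpt-5.4",
    api_key="sk-your-lemondata-key",  # placeholder key
    base_url="https://api.lemondata.cc/v1",
)

embeddings = OpenAIEmbeddings(
    model="text-embedding-3-small",
    api_key="sk-your-lemondata-key",  # placeholder key
    base_url="https://api.lemondata.cc/v1",
)
```

Because `base_url` is set per client, the two can even point at different endpoints if you later split chat and embeddings traffic.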
Use standard features here
Stick to standard chat, tool calling, streaming, and embeddings on ChatOpenAI. If you need vendor-native extras, switch to the vendor’s own LangChain integration.
Use cheaper models for retrieval
Use embedding models like text-embedding-3-small for retrieval and reserve stronger chat models for the final answer step.
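The split can be sketched end to end with plain cosine similarity. The tiny vectors below are hypothetical stand-ins for real `text-embedding-3-small` outputs (which are 1536-dimensional); only the retrieval step is shown, with the stronger chat model handling the final answer afterward.

```python
import math


def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


# Hypothetical precomputed document embeddings (embed_documents output).
docs = {
    "pricing": [0.9, 0.1, 0.0],
    "quickstart": [0.1, 0.8, 0.2],
}

# Hypothetical query embedding (embed_query output).
query_vec = [0.85, 0.15, 0.05]

# Retrieve with the cheap embedding model; the best-matching document
# is then passed as context to the stronger chat model for the answer.
best = max(docs, key=lambda name: cosine(query_vec, docs[name]))
print(best)  # → pricing
```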