LiteLLM Integration
LiteLLM (GitHub): Use any LLM as a drop-in replacement for GPT. Use Azure, OpenAI, Cohere, Anthropic, Ollama, vLLM, SageMaker, HuggingFace, Replicate (100+ LLMs).
You can find more in-depth documentation in the LiteLLM docs.
There are three ways to integrate LiteLLM with Langfuse:
- LiteLLM Proxy with OpenAI SDK Wrapper: the proxy standardizes 100+ models on the OpenAI API schema, and the Langfuse OpenAI SDK wrapper instruments the LLM calls.
- LiteLLM Python SDK which can send logs to Langfuse if the environment variables are set.
- LiteLLM Proxy which can send logs to Langfuse if enabled in the UI.
Integration
A simple way to integrate LiteLLM with Langfuse is to set LiteLLM's success and failure callbacks to Langfuse, so that all responses and errors are logged.
To see a full end-to-end example, check out the LiteLLM cookbook.