
Langfuse Observability Provider

The Langfuse provider exports LLM call data to Langfuse, an open-source LLM observability platform. It implements the o11y.TraceExporter interface and sends trace and generation events through the Langfuse batch ingestion API.

Choose Langfuse when you want an open-source observability platform that can be self-hosted or used as a managed cloud service. Langfuse provides trace and generation analytics with a dashboard for cost tracking and latency monitoring. For LangChain ecosystem integration, consider LangSmith. For OTel-native local debugging, consider Phoenix.

Install the provider:

go get github.com/lookatitude/beluga-ai/o11y/providers/langfuse
| Option | Type | Default | Description |
| --- | --- | --- | --- |
| `WithPublicKey(key)` | `string` | (required) | Langfuse public key |
| `WithSecretKey(key)` | `string` | (required) | Langfuse secret key |
| `WithBaseURL(url)` | `string` | `https://cloud.langfuse.com` | Langfuse API endpoint |
| `WithTimeout(d)` | `time.Duration` | `10s` | HTTP request timeout |

Environment variables:

| Variable | Maps to |
| --- | --- |
| `LANGFUSE_PUBLIC_KEY` | `WithPublicKey` |
| `LANGFUSE_SECRET_KEY` | `WithSecretKey` |
package main

import (
	"context"
	"log"
	"os"
	"time"

	"github.com/lookatitude/beluga-ai/o11y"
	"github.com/lookatitude/beluga-ai/o11y/providers/langfuse"
)

func main() {
	exporter, err := langfuse.New(
		langfuse.WithPublicKey(os.Getenv("LANGFUSE_PUBLIC_KEY")),
		langfuse.WithSecretKey(os.Getenv("LANGFUSE_SECRET_KEY")),
	)
	if err != nil {
		log.Fatal(err)
	}

	err = exporter.ExportLLMCall(context.Background(), o11y.LLMCallData{
		Model:        "gpt-4o",
		Provider:     "openai",
		InputTokens:  500,
		OutputTokens: 150,
		Duration:     1200 * time.Millisecond,
		Cost:         0.003,
		Response:     "The capital of France is Paris.",
		Metadata: map[string]any{
			"user_id":    "user-123",
			"session_id": "sess-456",
		},
	})
	if err != nil {
		log.Fatal(err)
	}
}

Point the exporter to a self-hosted Langfuse instance:

exporter, err := langfuse.New(
	langfuse.WithPublicKey(os.Getenv("LANGFUSE_PUBLIC_KEY")),
	langfuse.WithSecretKey(os.Getenv("LANGFUSE_SECRET_KEY")),
	langfuse.WithBaseURL("https://langfuse.internal.example.com"),
)

Combine Langfuse with other observability providers:

import (
	"github.com/lookatitude/beluga-ai/o11y"
	"github.com/lookatitude/beluga-ai/o11y/providers/langfuse"
	"github.com/lookatitude/beluga-ai/o11y/providers/phoenix"
)

lfExporter, err := langfuse.New(
	langfuse.WithPublicKey(os.Getenv("LANGFUSE_PUBLIC_KEY")),
	langfuse.WithSecretKey(os.Getenv("LANGFUSE_SECRET_KEY")),
)
if err != nil {
	log.Fatal(err)
}

pxExporter, err := phoenix.New(
	phoenix.WithBaseURL("http://localhost:6006"),
)
if err != nil {
	log.Fatal(err)
}

multi := o11y.NewMultiExporter(lfExporter, pxExporter)

Each call to ExportLLMCall sends two events through Langfuse's batch ingestion API:

  1. trace-create — A top-level trace capturing the operation name, metadata, and timestamps
  2. generation-create — A child generation event linked to the trace, containing the model, token usage, cost, and duration

This structure provides both a high-level view of operations and detailed per-generation analytics in the Langfuse dashboard.

Langfuse uses HTTP Basic authentication. The provider encodes the credentials as base64(publicKey:secretKey) and sends the result in the Authorization header.

err = exporter.ExportLLMCall(ctx, data)
if err != nil {
	// Errors include authentication failures, network issues, and API errors.
	log.Printf("Langfuse export failed: %v", err)
}

The Flush method is a no-op because the provider sends data synchronously over HTTP; there is no internal buffer to drain.