LangSmith Observability Provider

The LangSmith provider exports LLM call data to LangSmith, LangChain’s platform for debugging, testing, and monitoring LLM applications. It implements the o11y.TraceExporter interface and sends run data through the LangSmith batch API.

Choose LangSmith when you want a managed platform with strong debugging and testing capabilities for LLM applications. LangSmith organizes traces into projects and provides detailed run inspection with inputs, outputs, and token usage. For an open-source self-hostable alternative, consider Langfuse. For workspace-level analytics, consider Opik.

Install the provider package:

```sh
go get github.com/lookatitude/beluga-ai/o11y/providers/langsmith
```
Configuration options:

| Option | Type | Default | Description |
| --- | --- | --- | --- |
| `WithAPIKey(key)` | `string` | (required) | LangSmith API key (`lsv2_...`) |
| `WithProject(name)` | `string` | `"default"` | LangSmith project name |
| `WithBaseURL(url)` | `string` | `https://api.smith.langchain.com` | API endpoint |
| `WithTimeout(d)` | `time.Duration` | `10s` | HTTP request timeout |

Environment variables:

| Variable | Maps to |
| --- | --- |
| `LANGSMITH_API_KEY` | `WithAPIKey` |
A minimal example that creates the exporter and exports a single LLM call:

```go
package main

import (
	"context"
	"log"
	"os"
	"time"

	"github.com/lookatitude/beluga-ai/o11y"
	"github.com/lookatitude/beluga-ai/o11y/providers/langsmith"
)

func main() {
	exporter, err := langsmith.New(
		langsmith.WithAPIKey(os.Getenv("LANGSMITH_API_KEY")),
		langsmith.WithProject("my-project"),
	)
	if err != nil {
		log.Fatal(err)
	}

	err = exporter.ExportLLMCall(context.Background(), o11y.LLMCallData{
		Model:        "gpt-4o",
		Provider:     "openai",
		InputTokens:  500,
		OutputTokens: 150,
		Duration:     1200 * time.Millisecond,
		Cost:         0.003,
		Response:     "The capital of France is Paris.",
		Metadata: map[string]any{
			"temperature": 0.7,
			"max_tokens":  2048,
		},
	})
	if err != nil {
		log.Fatal(err)
	}
}
```

LangSmith organizes traces into projects. Use the WithProject option to route traces to a specific project:

```go
// Development traces
devExporter, err := langsmith.New(
	langsmith.WithAPIKey(os.Getenv("LANGSMITH_API_KEY")),
	langsmith.WithProject("my-app-dev"),
)

// Production traces
prodExporter, err := langsmith.New(
	langsmith.WithAPIKey(os.Getenv("LANGSMITH_API_KEY")),
	langsmith.WithProject("my-app-prod"),
)
```

Each ExportLLMCall creates a LangSmith “run” with:

  • Run type: "llm"
  • Name: Formatted as "provider/model" (e.g., "openai/gpt-4o")
  • Inputs: The message history sent to the model
  • Outputs: The model’s response
  • Extras: Token counts, cost, duration, and any additional metadata
  • Session name: The configured project name
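Conceptually, a run built from the fields above resembles the structure below. This is an illustration only — the provider assembles the actual batch API payload internally, and the exact field names it uses are an implementation detail:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// exampleRun sketches the shape of an exported run using the fields
// listed above. Field names are assumptions for illustration.
func exampleRun() map[string]any {
	return map[string]any{
		"run_type":     "llm",
		"name":         "openai/gpt-4o",
		"inputs":       map[string]any{"messages": "..."},
		"outputs":      map[string]any{"response": "The capital of France is Paris."},
		"extra":        map[string]any{"input_tokens": 500, "output_tokens": 150, "cost": 0.003},
		"session_name": "my-project",
	}
}

func main() {
	b, _ := json.MarshalIndent(exampleRun(), "", "  ")
	fmt.Println(string(b))
}
```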

Combine LangSmith with other observability providers:

```go
import (
	"log"
	"os"

	"github.com/lookatitude/beluga-ai/o11y"
	"github.com/lookatitude/beluga-ai/o11y/providers/langfuse"
	"github.com/lookatitude/beluga-ai/o11y/providers/langsmith"
)

lsExporter, err := langsmith.New(
	langsmith.WithAPIKey(os.Getenv("LANGSMITH_API_KEY")),
)
if err != nil {
	log.Fatal(err)
}

lfExporter, err := langfuse.New(
	langfuse.WithPublicKey(os.Getenv("LANGFUSE_PUBLIC_KEY")),
	langfuse.WithSecretKey(os.Getenv("LANGFUSE_SECRET_KEY")),
)
if err != nil {
	log.Fatal(err)
}

multi := o11y.NewMultiExporter(lsExporter, lfExporter)
```

LangSmith uses API key authentication via the `x-api-key` HTTP header. API keys are prefixed with `lsv2_`.

```go
err = exporter.ExportLLMCall(ctx, data)
if err != nil {
	// Errors include authentication failures, network issues, and API errors.
	log.Printf("LangSmith export failed: %v", err)
}
```

The `Flush` method is a no-op since the provider sends data synchronously via HTTP.