# LangSmith Observability Provider
The LangSmith provider exports LLM call data to LangSmith, LangChain's platform for debugging, testing, and monitoring LLM applications. It implements the `o11y.TraceExporter` interface and sends run data through the LangSmith batch API.

Choose LangSmith when you want a managed platform with strong debugging and testing capabilities for LLM applications. LangSmith organizes traces into projects and provides detailed run inspection with inputs, outputs, and token usage. For an open-source, self-hostable alternative, consider Langfuse; for workspace-level analytics, consider Opik.
## Installation

```shell
go get github.com/lookatitude/beluga-ai/o11y/providers/langsmith
```
## Configuration

| Option | Type | Default | Description |
|---|---|---|---|
| `WithAPIKey(key)` | `string` | (required) | LangSmith API key (`lsv2_...`) |
| `WithProject(name)` | `string` | `"default"` | LangSmith project name |
| `WithBaseURL(url)` | `string` | `https://api.smith.langchain.com` | API endpoint |
| `WithTimeout(d)` | `time.Duration` | `10s` | HTTP request timeout |

Environment variables:

| Variable | Maps to |
|---|---|
| `LANGSMITH_API_KEY` | `WithAPIKey` |
## Basic Usage

```go
package main

import (
	"context"
	"log"
	"os"
	"time"

	"github.com/lookatitude/beluga-ai/o11y"
	"github.com/lookatitude/beluga-ai/o11y/providers/langsmith"
)

func main() {
	exporter, err := langsmith.New(
		langsmith.WithAPIKey(os.Getenv("LANGSMITH_API_KEY")),
		langsmith.WithProject("my-project"),
	)
	if err != nil {
		log.Fatal(err)
	}

	err = exporter.ExportLLMCall(context.Background(), o11y.LLMCallData{
		Model:        "gpt-4o",
		Provider:     "openai",
		InputTokens:  500,
		OutputTokens: 150,
		Duration:     1200 * time.Millisecond,
		Cost:         0.003,
		Response:     "The capital of France is Paris.",
		Metadata: map[string]any{
			"temperature": 0.7,
			"max_tokens":  2048,
		},
	})
	if err != nil {
		log.Fatal(err)
	}
}
```
## Project Organization

LangSmith organizes traces into projects. Use the `WithProject` option to route traces to a specific project:

```go
// Development traces
devExporter, err := langsmith.New(
	langsmith.WithAPIKey(os.Getenv("LANGSMITH_API_KEY")),
	langsmith.WithProject("my-app-dev"),
)

// Production traces
prodExporter, err := langsmith.New(
	langsmith.WithAPIKey(os.Getenv("LANGSMITH_API_KEY")),
	langsmith.WithProject("my-app-prod"),
)
```
## Run Model

Each `ExportLLMCall` creates a LangSmith "run" with:

- Run type: `"llm"`
- Name: formatted as `"provider/model"` (e.g., `"openai/gpt-4o"`)
- Inputs: the message history sent to the model
- Outputs: the model's response
- Extras: token counts, cost, duration, and any additional metadata
- Session name: the configured project name
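Concretely, a run like the Basic Usage example might serialize into a batch payload along these lines. This is a hand-written sketch based on LangSmith's public run schema (`post`/`patch` arrays, `run_type`, `session_name`, `extra`); the exact field layout the provider emits is an assumption, and the `id` and timestamps are made up:

```json
{
  "post": [
    {
      "id": "2d1e8f0a-9b7c-4d3e-8a21-0c5f6e7a9b01",
      "name": "openai/gpt-4o",
      "run_type": "llm",
      "session_name": "my-project",
      "start_time": "2025-01-01T12:00:00Z",
      "end_time": "2025-01-01T12:00:01.200Z",
      "inputs": {
        "messages": [{ "role": "user", "content": "What is the capital of France?" }]
      },
      "outputs": { "text": "The capital of France is Paris." },
      "extra": {
        "input_tokens": 500,
        "output_tokens": 150,
        "cost": 0.003,
        "temperature": 0.7,
        "max_tokens": 2048
      }
    }
  ],
  "patch": []
}
```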
## With MultiExporter

Combine LangSmith with other observability providers:

```go
import (
	"github.com/lookatitude/beluga-ai/o11y"
	"github.com/lookatitude/beluga-ai/o11y/providers/langfuse"
	"github.com/lookatitude/beluga-ai/o11y/providers/langsmith"
)

lsExporter, err := langsmith.New(
	langsmith.WithAPIKey(os.Getenv("LANGSMITH_API_KEY")),
)
if err != nil {
	log.Fatal(err)
}

lfExporter, err := langfuse.New(
	langfuse.WithPublicKey(os.Getenv("LANGFUSE_PUBLIC_KEY")),
	langfuse.WithSecretKey(os.Getenv("LANGFUSE_SECRET_KEY")),
)
if err != nil {
	log.Fatal(err)
}

multi := o11y.NewMultiExporter(lsExporter, lfExporter)
```
## Authentication

LangSmith uses API key authentication via the `x-api-key` HTTP header. API keys are prefixed with `lsv2_`.
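For illustration, here is how a request to the batch endpoint would carry that header using only the standard library. `newBatchRequest` is a hypothetical helper, not part of beluga-ai; the header name and `lsv2_` prefix come from the text above:

```go
package main

import (
	"fmt"
	"net/http"
	"strings"
)

// newBatchRequest builds a POST to the LangSmith batch endpoint with the
// API key in the x-api-key header, the scheme described above.
func newBatchRequest(apiKey, body string) (*http.Request, error) {
	req, err := http.NewRequest(http.MethodPost,
		"https://api.smith.langchain.com/runs/batch",
		strings.NewReader(body))
	if err != nil {
		return nil, err
	}
	req.Header.Set("x-api-key", apiKey)
	req.Header.Set("Content-Type", "application/json")
	return req, nil
}

func main() {
	req, err := newBatchRequest("lsv2_example_key", `{"post":[],"patch":[]}`)
	if err != nil {
		panic(err)
	}
	fmt.Println(req.Header.Get("x-api-key")) // lsv2_example_key
}
```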
## Error Handling

```go
err = exporter.ExportLLMCall(ctx, data)
if err != nil {
	// Errors include authentication failures, network issues, and API errors
	log.Printf("LangSmith export failed: %v", err)
}
```

The `Flush` method is a no-op because the provider sends data synchronously via HTTP.