Observability API — Tracing, Metrics, Logging
```go
import "github.com/lookatitude/beluga-ai/o11y"
```

Package o11y provides observability primitives for the Beluga AI framework: OpenTelemetry-based tracing and metrics following GenAI semantic conventions, structured logging via slog, health checks, and LLM-specific trace exporting.
Tracing
Tracing is built on OpenTelemetry with GenAI semantic convention attributes
(gen_ai.* namespace). StartSpan creates spans with typed attributes,
and InitTracer configures the global OTel tracer provider:
```go
shutdown, err := o11y.InitTracer("my-service",
    o11y.WithSpanExporter(exporter),
)
if err != nil {
    log.Fatal(err)
}
defer shutdown()
```
```go
ctx, span := o11y.StartSpan(ctx, "llm.generate", o11y.Attrs{
    o11y.AttrRequestModel: "gpt-4o",
    o11y.AttrSystem:       "openai",
})
defer span.End()
```

The Span interface wraps OTel spans with a simplified API for setting
attributes, recording errors, and setting status codes.
Metrics
Pre-registered GenAI metric instruments track token usage, operation duration, and estimated cost following OTel conventions:
```go
o11y.TokenUsage(ctx, inputTokens, outputTokens)
o11y.OperationDuration(ctx, durationMs)
o11y.Cost(ctx, estimatedUSD)
```

InitMeter configures the package-level meter with a service name.
Generic Counter and Histogram functions allow recording custom metrics.
Logging
Logger wraps slog.Logger with context-aware convenience methods and
functional options for configuration:
```go
logger := o11y.NewLogger(
    o11y.WithLogLevel("debug"),
    o11y.WithJSON(),
)
logger.Info(ctx, "request completed",
    "model", "gpt-4o",
    "tokens", 150,
)
```

Loggers propagate through context via WithLogger and FromContext.
Trace Exporting
The TraceExporter interface captures detailed LLM call data for analysis
backends. LLMCallData holds the full details of a single invocation
including model, provider, tokens, cost, messages, and response.
MultiExporter fans out to multiple backends simultaneously:
```go
multi := o11y.NewMultiExporter(langfuseExp, phoenixExp)
err := multi.ExportLLMCall(ctx, data)
```

Provider implementations include Langfuse, LangSmith, Opik, and Phoenix in the o11y/providers/ subpackages.
Health Checks
The HealthChecker interface provides health probes for components.
HealthRegistry aggregates named checkers and runs them concurrently
via HealthRegistry.CheckAll:
```go
registry := o11y.NewHealthRegistry()
registry.Register("database", dbChecker)
registry.Register("cache", cacheChecker)
results := registry.CheckAll(ctx)
```

HealthCheckerFunc adapts plain functions to the HealthChecker interface.
GenAI Attribute Constants
The package exports standard GenAI semantic convention attribute keys:
AttrAgentName, AttrOperationName, AttrToolName, AttrRequestModel,
AttrResponseModel, AttrInputTokens, AttrOutputTokens, and AttrSystem.
langfuse
```go
import "github.com/lookatitude/beluga-ai/o11y/providers/langfuse"
```

Package langfuse provides a Langfuse trace exporter for the Beluga AI observability system. It implements the [o11y.TraceExporter] interface and sends LLM call data to a Langfuse instance via its HTTP ingestion API.
Langfuse is an open-source LLM engineering platform for tracing, evaluation, prompt management, and analytics.
Create an exporter with your Langfuse credentials and use it to export LLM call data:
```go
exporter, err := langfuse.New(
    langfuse.WithBaseURL("https://cloud.langfuse.com"),
    langfuse.WithPublicKey("pk-..."),
    langfuse.WithSecretKey("sk-..."),
)
if err != nil {
    log.Fatal(err)
}
err = exporter.ExportLLMCall(ctx, data)
```

The exporter can be used standalone or composed with other exporters via [o11y.MultiExporter].
Configuration Options
- [WithBaseURL] — sets the Langfuse API base URL (default: https://cloud.langfuse.com)
- [WithPublicKey] — sets the Langfuse public key (required)
- [WithSecretKey] — sets the Langfuse secret key (required)
- [WithTimeout] — sets the HTTP client timeout (default: 10s)
langsmith
```go
import "github.com/lookatitude/beluga-ai/o11y/providers/langsmith"
```

Package langsmith provides a LangSmith trace exporter for the Beluga AI observability system. It implements the [o11y.TraceExporter] interface and sends LLM call data to LangSmith via its HTTP runs API.
LangSmith is LangChain’s platform for debugging, testing, evaluating, and monitoring LLM applications.
Create an exporter with your LangSmith API key and use it to export LLM call data:
```go
exporter, err := langsmith.New(
    langsmith.WithAPIKey("lsv2_..."),
    langsmith.WithProject("my-project"),
)
if err != nil {
    log.Fatal(err)
}
err = exporter.ExportLLMCall(ctx, data)
```

The exporter can be used standalone or composed with other exporters via [o11y.MultiExporter].
Configuration Options
- [WithBaseURL] — sets the LangSmith API base URL (default: https://api.smith.langchain.com)
- [WithAPIKey] — sets the LangSmith API key (required)
- [WithProject] — sets the LangSmith project name (default: “default”)
- [WithTimeout] — sets the HTTP client timeout (default: 10s)
opik
```go
import "github.com/lookatitude/beluga-ai/o11y/providers/opik"
```

Package opik provides an Opik trace exporter for the Beluga AI observability system. It implements the [o11y.TraceExporter] interface and sends LLM call data to Opik via its HTTP tracing API.
Opik (by Comet) provides LLM experiment tracking, tracing, and evaluation.
Create an exporter with your Opik credentials and use it to export LLM call data:
```go
exporter, err := opik.New(
    opik.WithAPIKey("opik-..."),
    opik.WithWorkspace("my-workspace"),
)
if err != nil {
    log.Fatal(err)
}
err = exporter.ExportLLMCall(ctx, data)
```

The exporter can be used standalone or composed with other exporters via [o11y.MultiExporter].
Configuration Options
- [WithBaseURL] — sets the Opik API base URL (default: https://www.comet.com/opik/api)
- [WithAPIKey] — sets the Opik API key (required)
- [WithWorkspace] — sets the Opik workspace name (default: “default”)
- [WithTimeout] — sets the HTTP client timeout (default: 10s)
phoenix
```go
import "github.com/lookatitude/beluga-ai/o11y/providers/phoenix"
```

Package phoenix provides an Arize Phoenix trace exporter for the Beluga AI observability system. It implements the [o11y.TraceExporter] interface and sends LLM call data to an Arize Phoenix instance via its HTTP API.
Phoenix uses OTel-compatible spans, so this exporter translates LLM call data into the Phoenix /v1/traces JSON format.
Create an exporter pointing to your Phoenix instance:
```go
exporter, err := phoenix.New(
    phoenix.WithBaseURL("http://localhost:6006"),
)
if err != nil {
    log.Fatal(err)
}
err = exporter.ExportLLMCall(ctx, data)
```

The exporter can be used standalone or composed with other exporters via [o11y.MultiExporter].
Configuration Options
- [WithBaseURL] — sets the Phoenix API base URL (default: http://localhost:6006)
- [WithAPIKey] — sets the Phoenix API key for authentication (optional)
- [WithTimeout] — sets the HTTP client timeout (default: 10s)