Phoenix Observability Provider

The Phoenix provider exports LLM call data to Phoenix, Arize AI’s open-source LLM observability platform. It implements the o11y.TraceExporter interface and sends OTel-compatible span data through the Phoenix traces API.

Choose Phoenix when you want a local-first, OTel-native observability tool for development and debugging. Phoenix runs locally with no API keys required, making it ideal for rapid iteration. It uses OpenTelemetry-compatible span formats, aligning with Beluga’s OTel GenAI conventions. For production observability with team features, consider Langfuse or LangSmith.

```shell
go get github.com/lookatitude/beluga-ai/o11y/providers/phoenix
```

| Option | Type | Default | Description |
| --- | --- | --- | --- |
| `WithBaseURL(url)` | `string` | `http://localhost:6006` | Phoenix server endpoint |
| `WithAPIKey(key)` | `string` | (none) | Optional bearer token for authentication |
| `WithTimeout(d)` | `time.Duration` | `10s` | HTTP request timeout |
```go
package main

import (
	"context"
	"log"
	"time"

	"github.com/lookatitude/beluga-ai/o11y"
	"github.com/lookatitude/beluga-ai/o11y/providers/phoenix"
)

func main() {
	exporter, err := phoenix.New(
		phoenix.WithBaseURL("http://localhost:6006"),
	)
	if err != nil {
		log.Fatal(err)
	}

	err = exporter.ExportLLMCall(context.Background(), o11y.LLMCallData{
		Model:        "gpt-4o",
		Provider:     "openai",
		InputTokens:  500,
		OutputTokens: 150,
		Duration:     1200 * time.Millisecond,
		Cost:         0.003,
		Response:     "The capital of France is Paris.",
		Metadata: map[string]any{
			"temperature": 0.7,
		},
	})
	if err != nil {
		log.Fatal(err)
	}
}
```

Phoenix is commonly run locally for development. Start a Phoenix server and point the exporter at it:

```shell
# Start Phoenix (Python)
pip install arize-phoenix
phoenix serve
# Phoenix UI available at http://localhost:6006
```

```go
exporter, err := phoenix.New(
	phoenix.WithBaseURL("http://localhost:6006"),
)
```

No API key is needed for local Phoenix instances.

For hosted Phoenix deployments, provide an API key:

```go
exporter, err := phoenix.New(
	phoenix.WithBaseURL("https://phoenix.example.com"),
	phoenix.WithAPIKey(os.Getenv("PHOENIX_API_KEY")),
)
```

The API key is sent as a bearer token in the Authorization header.

Phoenix uses an OpenTelemetry-compatible span format. Each ExportLLMCall creates a span with:

- Span kind: `"LLM"`
- Trace and span IDs: generated as random hex strings (32-char trace ID, 16-char span ID)
- Status: `"OK"` on success, `"ERROR"` with a message on failure
- Attributes: mapped to OTel GenAI conventions
| Attribute | Description |
| --- | --- |
| `llm.model_name` | Model identifier |
| `llm.provider` | Provider name |
| `llm.token_count.prompt` | Input token count |
| `llm.token_count.completion` | Output token count |
| `llm.token_count.total` | Total token count |
| `llm.cost` | Estimated cost |
| `input.value` | Serialized input messages |
| `output.value` | Model response text |
| `metadata.*` | Custom metadata fields (prefixed) |

Combine Phoenix with other observability providers:

```go
import (
	"github.com/lookatitude/beluga-ai/o11y"
	"github.com/lookatitude/beluga-ai/o11y/providers/langsmith"
	"github.com/lookatitude/beluga-ai/o11y/providers/phoenix"
)

pxExporter, err := phoenix.New(
	phoenix.WithBaseURL("http://localhost:6006"),
)
if err != nil {
	log.Fatal(err)
}

lsExporter, err := langsmith.New(
	langsmith.WithAPIKey(os.Getenv("LANGSMITH_API_KEY")),
)
if err != nil {
	log.Fatal(err)
}

// multi fans each ExportLLMCall out to both backends.
multi := o11y.NewMultiExporter(pxExporter, lsExporter)
```

Export errors are wrapped with a provider prefix, so Phoenix failures are easy to identify in logs:

```go
err = exporter.ExportLLMCall(ctx, data)
if err != nil {
	// Error format: "phoenix: export trace: <underlying error>"
	log.Printf("Phoenix export failed: %v", err)
}
```

The Flush method is a no-op since the provider sends data synchronously via HTTP.