Providers — 108 Integrations

Beluga AI v2 ships with 108 provider implementations across 15 categories. Every provider follows the same registry pattern: import its package and it auto-registers via init(). Switching providers takes only a blank import and a configuration change; no call-site code changes are required.

Each provider page documents configuration options, code examples, and guidance on when to choose that provider. All examples use the github.com/lookatitude/beluga-ai module path and handle errors explicitly.

| Category | Providers | Interface | Description |
|---|---|---|---|
| LLM | 22 | llm.ChatModel | OpenAI, Anthropic, Google, Ollama, Bedrock, Groq, and 16 more |
| Embedding | 9 | embedding.Embedder | OpenAI, Cohere, Google, Jina, Mistral, Ollama, Voyage, Sentence Transformers |
| Vector Store | 13 | vectorstore.VectorStore | pgvector, Pinecone, Qdrant, Weaviate, Milvus, Elasticsearch, Redis, and more |
| Voice (STT/TTS) | 14 | stt.STT / tts.TTS | Deepgram, ElevenLabs, AssemblyAI, Cartesia, Whisper, Groq, and more |
| Document Loader | 8 | loader.DocumentLoader | Cloud storage, Confluence, Firecrawl, Google Drive, GitHub, Notion |
| Guard | 5 | guard.Guard | Azure Safety, Guardrails AI, Lakera, LLM Guard, NeMo |
| Evaluation | 3 | eval.Metric | Braintrust, DeepEval, RAGAS |
| Observability | 4 | o11y.TracerProvider | Langfuse, LangSmith, Opik, Phoenix |
| Workflow | 6 | workflow.Engine | Dapr, In-memory, Inngest, Kafka, NATS, Temporal |
| Transport | 3 | transport.Transport | Daily, LiveKit, Pipecat |
| VAD | 2 | vad.Detector | Silero, WebRTC |
| Cache | 1 | cache.Store | In-memory |
| State | 1 | state.Store | In-memory |
| Prompt | 1 | prompt.Loader | File-based |
| MCP | 1 | MCP integration | Composio |

Every provider category uses the same pattern:

```go
import (
	"github.com/lookatitude/beluga-ai/llm"

	_ "github.com/lookatitude/beluga-ai/llm/providers/anthropic" // auto-registers
	_ "github.com/lookatitude/beluga-ai/llm/providers/openai"    // auto-registers
)

// Create a provider by name.
model, err := llm.New("openai", cfg)

// Discover available providers.
names := llm.List() // ["anthropic", "openai", ...]
```

Any provider category can be extended by implementing its interface and calling Register() from an init() function:

```go
func init() {
	llm.Register("my-provider", func(cfg config.ProviderConfig) (llm.ChatModel, error) {
		return &myModel{apiKey: cfg.APIKey}, nil
	})
}
```
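To make the mechanics concrete, here is a minimal, self-contained sketch of how a registry like this can work. It is not Beluga AI's actual implementation: the `ChatModel` interface, `ProviderConfig` struct, and `myModel` type below are simplified stand-ins for the real library types, kept in one package so the example runs on its own.

```go
package main

import (
	"fmt"
	"sort"
	"sync"
)

// Simplified stand-ins for the real llm.ChatModel and config.ProviderConfig.
type ChatModel interface {
	Name() string
}

type ProviderConfig struct {
	APIKey string
}

type factory func(cfg ProviderConfig) (ChatModel, error)

var (
	mu       sync.RWMutex
	registry = map[string]factory{}
)

// Register adds a named factory; provider packages call this from init(),
// which is why a blank import is enough to make a provider available.
func Register(name string, f factory) {
	mu.Lock()
	defer mu.Unlock()
	registry[name] = f
}

// New looks up a registered factory by name and constructs the model.
func New(name string, cfg ProviderConfig) (ChatModel, error) {
	mu.RLock()
	f, ok := registry[name]
	mu.RUnlock()
	if !ok {
		return nil, fmt.Errorf("llm: unknown provider %q", name)
	}
	return f(cfg)
}

// List returns the registered provider names, sorted.
func List() []string {
	mu.RLock()
	defer mu.RUnlock()
	names := make([]string, 0, len(registry))
	for name := range registry {
		names = append(names, name)
	}
	sort.Strings(names)
	return names
}

// A toy provider. In the real library this would live in its own
// package and register itself when blank-imported.
type myModel struct{ apiKey string }

func (m *myModel) Name() string { return "my-provider" }

func init() {
	Register("my-provider", func(cfg ProviderConfig) (ChatModel, error) {
		return &myModel{apiKey: cfg.APIKey}, nil
	})
}

func main() {
	model, err := New("my-provider", ProviderConfig{APIKey: "test-key"})
	if err != nil {
		panic(err)
	}
	fmt.Println(model.Name()) // my-provider
	fmt.Println(List())       // [my-provider]
}
```

This is the same pattern Go's standard library uses for `database/sql` drivers: a blank import runs the package's init(), init() calls Register, and the caller constructs instances by string name without depending on any provider package directly.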

See the API Reference for complete interface definitions, or browse individual provider pages for configuration details and usage examples.