# Integrations Overview
Beluga AI is designed to work with the systems you already use. These integration guides cover connecting the framework to LLM providers, vector databases, cloud infrastructure, observability platforms, and communication channels. Each guide includes configuration, authentication, working code examples, and troubleshooting for the specific service.
The guides range from drop-in provider registrations (one import line) to custom loaders and retrievers that you build to match your infrastructure.
## Integration Categories

| Category | Guides |
|---|---|
| Agents & Tools | Tool registry, MCP integration, API bridges, testing |
| LLM Providers | OpenAI, Anthropic, Google, Bedrock, and 18 more |
| Embeddings | OpenAI, Cohere, Ollama, Jina, Voyage |
| Data & Storage | Vector stores, document loaders, S3, MongoDB, Redis, Elasticsearch |
| Voice | STT, TTS, S2S, VAD, transport, session management |
| Infrastructure | Kubernetes, HashiCorp Vault, NATS, Auth0 |
| Observability | Langfuse, LangSmith, Datadog, Phoenix, Opik |
| Messaging | Slack, Twilio, webhooks |
| Prompts & Schema | LangChain Hub, filesystem templates, JSON Schema |
| Safety & Compliance | JSON reporting, ethical API filters |
## Integration Pattern

Beluga AI uses a registry-based architecture where providers auto-register via Go's `init()` mechanism. This means most integrations follow the same three steps (import, configure, create) regardless of the underlying service:
```go
// 1. Import the provider (auto-registers via init())
import _ "github.com/lookatitude/beluga-ai/llm/providers/openai"

// 2. Configure
cfg := config.ProviderConfig{
	APIKey: os.Getenv("OPENAI_API_KEY"),
	Model:  "gpt-4o",
}

// 3. Create and use
model, err := llm.New("openai", cfg)
```