Everything You Need to Build AI Agents in Go
Beluga AI covers the full stack — from LLM abstraction to production deployment.
Agent Runtime & Reasoning
Build agents with pluggable reasoning — ReAct, Tree-of-Thought, LATS, Reflexion, and more. Handoffs-as-tools enable multi-agent collaboration with zero boilerplate.
LLM Abstraction & Routing
Unified ChatModel interface across 22+ providers with intelligent routing, structured output, context window management, and provider-aware rate limiting.
RAG Pipeline
Hybrid retrieval combining dense vectors, BM25, and graph traversal with RRF. Advanced strategies: CRAG, Adaptive RAG, HyDE, GraphRAG.
Voice Pipeline
Frame-based STT-to-LLM-to-TTS processing with sub-800ms latency. Speech-to-speech, Silero VAD, semantic turn detection.
Orchestration Patterns
Five orchestration patterns — Supervisor, Hierarchical, Scatter-Gather, Router, and Blackboard — plus durable workflows that survive crashes and human-in-the-loop delays.
Memory Systems
Three-tier MemGPT memory (Core, Recall, Archival) with six strategies: buffer, window, summary, entity, semantic, and graph. Self-editable via agent tools.
Observability
OpenTelemetry GenAI semantic conventions at every boundary. Token counting, cost tracking, structured logging, health checks.
Tools & MCP
Wrap any Go function as a tool. Discover and consume MCP servers. Expose your tools as MCP services. Parallel DAG execution.
Guardrails & Safety
Three-stage guard pipeline with prompt injection detection, PII filtering, and capability-based sandboxing under default-deny policies.
Protocols & Interop
First-class MCP server/client and A2A for agent-to-agent communication. REST, SSE, gRPC, and WebSocket transports.
Learn one pattern, extend everything
Every package follows the same four-component structure: Extension Interface, Registry + Factory, Lifecycle Hooks, and Middleware Chains. Add new LLM providers, custom planners, vector stores, or voice processors from your own code — zero framework changes required.
Read the extensibility guide

func init() {
	llm.Register("my-provider", func(cfg llm.ProviderConfig) (llm.ChatModel, error) {
		return NewMyProvider(cfg), nil
	})
}