Orchestration Patterns
Five orchestration patterns — Supervisor, Hierarchical, Scatter-Gather, Router, and Blackboard — plus durable workflows that survive crashes and graph-based DAGs with typed state and conditional edges.
Overview
Building multi-agent systems requires more than individual agent capabilities — it requires coordination patterns that manage how agents communicate, delegate, and combine results. Beluga AI provides five orchestration patterns covering the most common multi-agent topologies, from centralized supervisors to decentralized blackboard architectures.
Beyond runtime orchestration, Beluga includes a durable workflow engine backed by an event log. Workflows survive process crashes, container restarts, and human-in-the-loop delays that may span hours or days. Every state transition is persisted, enabling replay, debugging, and auditing of long-running agent processes.
For complex branching logic, graph orchestration offers graph-based execution with typed state, reducers, and conditional edges. Inspired by LangGraph's approach but implemented natively in Go, the graph engine supports bounded cycles (with configurable iteration limits), parallel fan-out, and state checkpointing — making it suitable for everything from simple chains to intricate multi-step reasoning flows.
Capabilities
Supervisor
A central supervisor agent receives the task, decomposes it into sub-tasks, delegates each to a specialist agent, and validates the combined results. The supervisor maintains a plan, tracks progress, and can re-delegate on failure. This pattern is best for multi-domain workflows where a single coordinator needs to reason about task decomposition and result synthesis.
Hierarchical
Extends the supervisor pattern into a tree of supervisors, each managing its own sub-team. A top-level supervisor delegates to mid-level supervisors, which in turn delegate to specialist agents. This pattern scales to large organizations of agents where no single supervisor can manage all specialists directly.
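The recursive delegation at the heart of this pattern can be sketched in plain Go. This is a minimal illustration of the mechanics, not the Beluga API; the `member` type and its fields are invented for this sketch.

```go
package main

import "fmt"

// A member of the supervisor tree is either a specialist (a leaf with a
// work function) or a supervisor whose children handle the task for it.
type member struct {
	name     string
	children []*member
	work     func(task string) string // set on leaves only
}

// handle delegates recursively: supervisors fan the task out to their
// team and merge the results; specialists execute directly.
func (m *member) handle(task string) string {
	if len(m.children) == 0 {
		return m.name + ": " + m.work(task)
	}
	out := ""
	for _, c := range m.children {
		out += c.handle(task) + "\n"
	}
	return out
}

func main() {
	specialist := func(name string) *member {
		return &member{name: name, work: func(task string) string { return "done" }}
	}
	// Two mid-level supervisors, each managing its own specialists.
	tree := &member{name: "top", children: []*member{
		{name: "research-lead", children: []*member{specialist("researcher-1"), specialist("researcher-2")}},
		{name: "writing-lead", children: []*member{specialist("writer-1")}},
	}}
	fmt.Print(tree.handle("compile report"))
}
```

In a real system the supervisors would themselves be agents that reason about how to split the task, rather than blindly broadcasting it.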
Scatter-Gather
Sends the same or related tasks to multiple agents in parallel, then consolidates their results. A configurable aggregation function combines outputs — for example, selecting the best response, merging partial results, or using majority voting. This pattern is ideal for tasks that benefit from diverse perspectives or redundancy.
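The parallel fan-out and pluggable aggregation can be sketched with goroutines and a wait group. This is a stdlib illustration of the mechanic, not the Beluga API; `agentFn` and `scatterGather` are names invented for the sketch.

```go
package main

import (
	"fmt"
	"sync"
)

// agentFn stands in for an agent call; in practice each would hit an LLM.
type agentFn func(task string) string

// scatterGather fans a task out to all agents in parallel, then combines
// the collected results with the supplied aggregation function.
func scatterGather(task string, agents []agentFn, aggregate func([]string) string) string {
	results := make([]string, len(agents))
	var wg sync.WaitGroup
	for i, a := range agents {
		wg.Add(1)
		go func(i int, a agentFn) {
			defer wg.Done()
			results[i] = a(task) // each agent writes only its own slot
		}(i, a)
	}
	wg.Wait()
	return aggregate(results)
}

func main() {
	agents := []agentFn{
		func(t string) string { return "short answer" },
		func(t string) string { return "a much longer, more detailed answer" },
	}
	// Aggregation strategy: pick the longest response as a crude "best" proxy;
	// real systems might merge, vote, or score with a judge model instead.
	longest := func(rs []string) string {
		best := rs[0]
		for _, r := range rs[1:] {
			if len(r) > len(best) {
				best = r
			}
		}
		return best
	}
	fmt.Println(scatterGather("explain latency", agents, longest))
}
```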
Router
Conditional routing directs inputs to different agents or sub-pipelines based on input characteristics. Routing decisions can be rule-based (keyword matching, regex), classifier-based (using a small LLM), or embedding-based (semantic similarity). The router pattern is the building block for intent-based dispatch systems and tiered support workflows.
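The rule-based variant is the simplest to show. Below is a keyword-matching sketch in plain Go, assuming nothing from Beluga; the `rule` type and `route` function are illustrative names. Classifier- and embedding-based routing would replace the matching step with a model call.

```go
package main

import (
	"fmt"
	"strings"
)

// rule maps a keyword to the agent that should handle matching inputs.
type rule struct {
	keyword string
	agent   string
}

// route returns the agent for the first rule whose keyword appears in the
// input (case-insensitive), or the fallback when nothing matches.
func route(input string, rules []rule, fallback string) string {
	lower := strings.ToLower(input)
	for _, r := range rules {
		if strings.Contains(lower, r.keyword) {
			return r.agent
		}
	}
	return fallback
}

func main() {
	rules := []rule{
		{"refund", "billing-agent"},
		{"crash", "support-agent"},
	}
	fmt.Println(route("I want a refund for my order", rules, "general-agent"))
	// → billing-agent
}
```

Rule order matters: the first match wins, so more specific rules should come before broader ones.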
Blackboard
Agents communicate through a shared state store (the blackboard) rather than direct message passing. Each agent reads from and writes to the blackboard independently, with a conflict-resolver handling concurrent updates. This pattern excels when agents have loosely coupled contributions and the workflow is data-driven rather than control-driven.
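A minimal sketch of the shared store with a pluggable conflict resolver, using only the standard library (the `blackboard` type here is invented for illustration, not Beluga's):

```go
package main

import (
	"fmt"
	"sync"
)

// blackboard is a shared key-value store; resolve decides how an incoming
// write merges with an existing value instead of silently overwriting it.
type blackboard struct {
	mu      sync.Mutex
	entries map[string]string
	resolve func(old, incoming string) string
}

func newBlackboard(resolve func(old, incoming string) string) *blackboard {
	return &blackboard{entries: map[string]string{}, resolve: resolve}
}

// Write applies the conflict resolver when the key already has a value,
// so concurrent contributions from different agents are merged, not lost.
func (b *blackboard) Write(key, val string) {
	b.mu.Lock()
	defer b.mu.Unlock()
	if old, ok := b.entries[key]; ok {
		val = b.resolve(old, val)
	}
	b.entries[key] = val
}

func (b *blackboard) Read(key string) string {
	b.mu.Lock()
	defer b.mu.Unlock()
	return b.entries[key]
}

func main() {
	// Conflict resolver: append contributions rather than overwrite.
	bb := newBlackboard(func(old, incoming string) string { return old + "; " + incoming })
	bb.Write("findings", "latency drops with frame batching")
	bb.Write("findings", "jitter rises under load")
	fmt.Println(bb.Read("findings"))
}
```

Each agent only needs a reference to the blackboard, which is what keeps the pattern loosely coupled: agents never address each other directly.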
Durable Workflows
The durable workflow engine persists every state transition to an event log, ensuring workflows survive crashes, restarts, and long pauses. Human-in-the-loop approval steps can block a workflow for hours or days without losing progress. The engine supports compensating actions (sagas), timeout-based escalation, and full replay for debugging. While Beluga ships its own engine by default, Temporal is available as an alternative provider.
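The core replay idea can be shown in a few lines of plain Go: persist an event per completed step, and on restart consult the log so finished work is skipped rather than re-executed. This is a conceptual sketch (in-memory log, invented names), not the engine's actual storage format.

```go
package main

import "fmt"

// eventLog records completed step names; a real engine would append each
// entry to durable storage (disk, database) before the workflow advances.
type eventLog struct {
	done []string
}

func (l *eventLog) record(step string) { l.done = append(l.done, step) }

func (l *eventLog) completed(step string) bool {
	for _, s := range l.done {
		if s == step {
			return true
		}
	}
	return false
}

// run executes the workflow, skipping any step already in the log — this
// is what makes a resumed process pick up exactly where the crash left off.
func run(log *eventLog, steps []string, exec func(string)) {
	for _, step := range steps {
		if log.completed(step) {
			continue // replayed from the log, not re-executed
		}
		exec(step)
		log.record(step) // persist before advancing
	}
}

func main() {
	steps := []string{"research", "analyze", "write"}
	log := &eventLog{}

	// First run "crashes" after two steps; only the log survives.
	run(log, steps[:2], func(s string) {})

	// Resume: only the unfinished step actually executes.
	executed := []string{}
	run(log, steps, func(s string) { executed = append(executed, s) })
	fmt.Println(executed)
}
```

The same mechanism covers human-in-the-loop pauses: an approval step simply never records its event until the approval arrives, however long that takes.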
Graph Orchestration
Build execution flows as directed graphs (acyclic by default, with optional bounded cycles) where nodes are processing steps and edges define the flow. Each graph carries typed state with reducers that control how node outputs merge. Conditional edges enable branching based on state, and checkpointing allows resuming from any node. The graph engine supports parallel fan-out, fan-in synchronization, and configurable cycle limits for iterative refinement loops.
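The interplay of typed state, conditional edges, and a cycle limit can be sketched in plain Go. All names below (`state`, `node`, `runGraph`) are invented for the sketch and do not reflect Beluga's graph API.

```go
package main

import "fmt"

// state is the typed graph state; each node returns an updated copy,
// playing the role of a reducer that merges its output into the state.
type state struct {
	transcript []string
	attempts   int
}

type node func(state) state

// runGraph walks nodes by name. The next function is the conditional
// edge: it inspects the current state to pick the following node, and
// maxSteps bounds any cycles so refinement loops always terminate.
func runGraph(nodes map[string]node, next func(string, state) string, start string, maxSteps int) state {
	s := state{}
	cur := start
	for i := 0; i < maxSteps && cur != "end"; i++ {
		s = nodes[cur](s)
		cur = next(cur, s)
	}
	return s
}

func main() {
	nodes := map[string]node{
		"draft": func(s state) state {
			s.attempts++
			s.transcript = append(s.transcript, fmt.Sprintf("draft %d", s.attempts))
			return s
		},
		"review": func(s state) state {
			s.transcript = append(s.transcript, "review")
			return s
		},
	}
	// Conditional edge: cycle draft→review until two attempts, then end.
	next := func(cur string, s state) string {
		switch cur {
		case "draft":
			return "review"
		case "review":
			if s.attempts < 2 {
				return "draft" // loop back for another refinement pass
			}
		}
		return "end"
	}
	final := runGraph(nodes, next, "draft", 10)
	fmt.Println(final.transcript)
}
```

Checkpointing would amount to persisting the `(cur, state)` pair between iterations so execution can resume from any node.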
Architecture
Full Example
A supervisor orchestration with three specialist agents that collaborate on a research task:
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/lookatitude/beluga-ai/agent"
	"github.com/lookatitude/beluga-ai/llm"
	"github.com/lookatitude/beluga-ai/orchestration"

	_ "github.com/lookatitude/beluga-ai/llm/providers/openai"
)

func main() {
	ctx := context.Background()

	// Create the LLM model
	model, err := llm.New("openai", llm.ProviderConfig{
		Model: "gpt-4o",
	})
	if err != nil {
		log.Fatal(err)
	}

	// Define specialist agents
	researcher := agent.New("researcher",
		agent.WithModel(model),
		agent.WithSystemPrompt("You are a research specialist. Gather relevant information and return structured findings."),
	)
	analyst := agent.New("analyst",
		agent.WithModel(model),
		agent.WithSystemPrompt("You are a data analyst. Analyze findings and identify key insights and patterns."),
	)
	writer := agent.New("writer",
		agent.WithModel(model),
		agent.WithSystemPrompt("You are a technical writer. Synthesize analysis into a clear, well-structured report."),
	)

	// Create the supervisor orchestration
	supervisor := orchestration.NewSupervisor("research-supervisor",
		orchestration.WithModel(model),
		orchestration.WithAgents(researcher, analyst, writer),
		orchestration.WithSystemPrompt(`You are a research supervisor. Given a topic:
1. Delegate research to the researcher agent
2. Send findings to the analyst for pattern identification
3. Have the writer produce the final report
Validate each step before proceeding.`),
		orchestration.WithMaxIterations(10),
	)

	// Run the orchestration
	result, err := supervisor.Run(ctx, "Analyze the impact of frame-based voice pipelines on latency")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(result)

	// Streaming alternative — observe each delegation step
	for event, err := range supervisor.Stream(ctx, "Compare orchestration patterns for multi-agent systems") {
		if err != nil {
			log.Fatal(err)
		}
		switch e := event.(type) {
		case orchestration.DelegationEvent:
			fmt.Printf("[%s] delegated to %s\n", e.Supervisor, e.Agent)
		case orchestration.ResultEvent:
			fmt.Printf("[%s] completed: %s\n", e.Agent, e.Summary)
		}
	}
}