Prompt API — Templates, Versioning, Cache

import "github.com/lookatitude/beluga-ai/prompt"

Package prompt provides prompt template management and cache-optimized prompt building for the Beluga AI framework. It supports template rendering with Go’s text/template syntax, versioned template management via pluggable providers, and a builder that orders prompt content for optimal LLM cache hits.

Template represents a versioned prompt template with Go text/template syntax. Templates define default variable values and carry arbitrary metadata. Use Validate to check that the template content is parseable, and Render to execute the template with provided variables.

The PromptManager interface provides versioned access to prompt templates:

  • Get retrieves a template by name and version.
  • Render retrieves, renders, and returns a template as schema.Message values.
  • List returns summary information for all available templates.

Implementations include the filesystem-based provider in prompt/providers/file.

Builder constructs a prompt message sequence in cache-optimal order. LLM prompt caching works best when static content appears first, so Builder enforces this slot ordering:

  1. System prompt (most static)
  2. Tool definitions (semi-static)
  3. Static context documents (semi-static)
  4. Cache breakpoint marker (explicit cache boundary)
  5. Dynamic context messages (per-session)
  6. User input (always changes)

Template rendering:

	tmpl := &prompt.Template{
		Name:      "greeting",
		Version:   "1.0.0",
		Content:   "Hello, {{.name}}! Welcome to {{.system}}.",
		Variables: map[string]string{"system": "Beluga"},
	}
	result, err := tmpl.Render(map[string]any{"name": "Alice"})
	if err != nil {
		log.Fatal(err)
	}

Cache-optimized prompt building:

	msgs := prompt.NewBuilder(
		prompt.WithSystemPrompt("You are a helpful assistant."),
		prompt.WithStaticContext([]string{"Reference: ..."}),
		prompt.WithCacheBreakpoint(),
		prompt.WithDynamicContext(history),
		prompt.WithUserInput(schema.NewHumanMessage("Hello")),
	).Build()

import "github.com/lookatitude/beluga-ai/prompt/providers/file"

Package file provides a filesystem-based PromptManager that loads prompt templates from a directory of JSON files. Each JSON file represents a single template version with name, version, content, and variables fields.

FileManager implements prompt.PromptManager by loading templates from JSON files in a directory. Files must have a .json extension and contain a valid prompt.Template structure. All files are parsed on creation.

Templates are organized by name, with multiple versions supported per name. When retrieving a template without specifying a version, the latest version (lexicographically highest) is returned.

Each JSON file should contain:

	{
	    "name": "greeting",
	    "version": "1.0.0",
	    "content": "Hello, {{.name}}!",
	    "variables": {"name": "World"}
	}

	mgr, err := file.NewFileManager("/path/to/prompts")
	if err != nil {
		log.Fatal(err)
	}
	// Get a specific template version
	tmpl, err := mgr.Get("greeting", "1.0.0")
	if err != nil {
		log.Fatal(err)
	}
	// Render the latest version with variables
	msgs, err := mgr.Render("greeting", map[string]any{"name": "Alice"})
	if err != nil {
		log.Fatal(err)
	}
// List all available templates
infos := mgr.List()