AWS Bedrock LLM Provider

The AWS Bedrock provider connects Beluga AI to Amazon Bedrock’s multi-provider model catalog using the AWS SDK v2 Converse API. It supports models from Anthropic, Meta, Mistral, Cohere, Amazon, and others through a unified interface with native AWS authentication.

Choose Bedrock when you need AWS-native integration with IAM roles, VPC endpoints, and CloudTrail auditing. Bedrock consolidates access to models from multiple providers under a single billing and governance layer, which simplifies procurement and compliance in AWS-centric environments.

```shell
go get github.com/lookatitude/beluga-ai/llm/providers/bedrock
```
| Field | Required | Default | Description |
| --- | --- | --- | --- |
| `Model` | Yes | | Bedrock model ID (e.g. `"us.anthropic.claude-sonnet-4-5-20250929-v1:0"`) |
| `APIKey` | No | AWS default | AWS Access Key ID (optional; uses default credentials if unset) |
| `BaseURL` | No | AWS default | Override Bedrock endpoint |
| `Timeout` | No | 30s | Request timeout |

Provider-specific options (via Options map):

| Key | Default | Description |
| --- | --- | --- |
| `region` | `"us-east-1"` | AWS region |
| `secret_key` | | AWS Secret Access Key (if using static credentials) |

Environment variables (standard AWS SDK):

| Variable | Description |
| --- | --- |
| `AWS_ACCESS_KEY_ID` | AWS access key |
| `AWS_SECRET_ACCESS_KEY` | AWS secret key |
| `AWS_REGION` | AWS region |
| `AWS_PROFILE` | Named profile |

The provider uses the standard AWS SDK credential chain. If APIKey is set in the config, it creates static credentials using APIKey + secret_key. Otherwise, it falls back to the default credential chain (environment, shared config, IAM role, etc.).
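The selection rule described above can be sketched as a small pure function. This is a hypothetical simplification for illustration only; the `Config` type and `credentialSource` helper are invented here and are not part of the provider's actual code:

```go
package main

import "fmt"

// Config mirrors the ProviderConfig fields relevant to credential selection.
type Config struct {
	APIKey  string
	Options map[string]any
}

// credentialSource reports which credential source the provider would use:
// static credentials (APIKey + secret_key) when APIKey is set, otherwise the
// default AWS SDK credential chain (environment, shared config, IAM role).
func credentialSource(cfg Config) string {
	if cfg.APIKey != "" {
		return "static"
	}
	return "default-chain"
}

func main() {
	fmt.Println(credentialSource(Config{APIKey: "AKIAEXAMPLE"})) // static
	fmt.Println(credentialSource(Config{}))                      // default-chain
}
```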

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/lookatitude/beluga-ai/config"
	"github.com/lookatitude/beluga-ai/llm"
	"github.com/lookatitude/beluga-ai/schema"

	_ "github.com/lookatitude/beluga-ai/llm/providers/bedrock"
)

func main() {
	model, err := llm.New("bedrock", config.ProviderConfig{
		Model: "us.anthropic.claude-sonnet-4-5-20250929-v1:0",
		Options: map[string]any{
			"region": "us-east-1",
		},
	})
	if err != nil {
		log.Fatal(err)
	}

	msgs := []schema.Message{
		schema.NewSystemMessage("You are a helpful assistant."),
		schema.NewHumanMessage("What is the capital of France?"),
	}

	resp, err := model.Generate(context.Background(), msgs)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(resp.Text())
}
```
```go
for chunk, err := range model.Stream(context.Background(), msgs) {
	if err != nil {
		log.Fatal(err)
	}
	fmt.Print(chunk.Delta)
}
fmt.Println()
```

Bedrock streaming uses the Converse Stream API, which provides content block start/delta/stop events, message stop events with finish reason, and usage metadata.
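The event sequence above can be sketched with a small accumulator. The event kind strings below paraphrase the Converse Stream event categories and the `streamEvent` type is invented for this sketch; they are not the provider's actual types:

```go
package main

import (
	"fmt"
	"strings"
)

// streamEvent is a simplified stand-in for one Converse Stream event.
type streamEvent struct {
	Kind string // "contentBlockStart", "contentBlockDelta", "contentBlockStop", "messageStop"
	Text string // delta text, or the finish reason for messageStop
}

// collect accumulates content block deltas and records the finish reason
// carried by the message stop event.
func collect(events []streamEvent) (text, finishReason string) {
	var b strings.Builder
	for _, ev := range events {
		switch ev.Kind {
		case "contentBlockDelta":
			b.WriteString(ev.Text)
		case "messageStop":
			finishReason = ev.Text
		}
	}
	return b.String(), finishReason
}

func main() {
	text, reason := collect([]streamEvent{
		{Kind: "contentBlockStart"},
		{Kind: "contentBlockDelta", Text: "Hello, "},
		{Kind: "contentBlockDelta", Text: "world."},
		{Kind: "contentBlockStop"},
		{Kind: "messageStop", Text: "end_turn"},
	})
	fmt.Println(text, reason) // Hello, world. end_turn
}
```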

```go
tools := []schema.ToolDefinition{
	{
		Name:        "get_weather",
		Description: "Get current weather for a location",
		InputSchema: map[string]any{
			"type": "object",
			"properties": map[string]any{
				"location": map[string]any{
					"type":        "string",
					"description": "City name",
				},
			},
			"required": []any{"location"},
		},
	},
}

modelWithTools := model.BindTools(tools)

resp, err := modelWithTools.Generate(ctx, msgs, llm.WithToolChoice(llm.ToolChoiceAuto))
if err != nil {
	log.Fatal(err)
}
for _, tc := range resp.ToolCalls {
	fmt.Printf("Tool: %s, Args: %s\n", tc.Name, tc.Arguments)
}
```

Bedrock tool choice mapping:

| Beluga ToolChoice | Bedrock Equivalent |
| --- | --- |
| `llm.ToolChoiceAuto` | `AutoToolChoice` |
| `llm.ToolChoiceNone` | Omit tool config |
| `llm.ToolChoiceRequired` | `AnyToolChoice` |
| `llm.WithSpecificTool()` | `SpecificToolChoice` |
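The mapping in the table can be sketched as a lookup. The string labels stand in for the actual Beluga constants and AWS SDK types; `bedrockToolChoice` is a helper invented for this sketch:

```go
package main

import "fmt"

// bedrockToolChoice maps a Beluga tool-choice mode to the Bedrock Converse
// equivalent named in the table above. An empty result means the tool config
// is omitted from the request entirely.
func bedrockToolChoice(mode string) string {
	switch mode {
	case "auto":
		return "AutoToolChoice"
	case "none":
		return "" // tool config omitted
	case "required":
		return "AnyToolChoice"
	case "specific":
		return "SpecificToolChoice"
	}
	return "AutoToolChoice" // fall back to auto for unknown modes
}

func main() {
	fmt.Println(bedrockToolChoice("required")) // AnyToolChoice
}
```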

For models that support vision (e.g. Claude on Bedrock):

```go
msgs := []schema.Message{
	schema.NewHumanMessageWithParts(
		schema.TextPart{Text: "Describe this image."},
		schema.ImagePart{
			Data:     imageBytes,
			MimeType: "image/png",
		},
	),
}

resp, err := model.Generate(ctx, msgs)
```

Supported image formats: PNG, JPEG, GIF, WebP.

```go
resp, err := model.Generate(ctx, msgs,
	llm.WithTemperature(0.7),
	llm.WithMaxTokens(4096),
	llm.WithTopP(0.9),
	llm.WithStopSequences("END"),
)
```

```go
resp, err := model.Generate(ctx, msgs)
if err != nil {
	// Errors are wrapped with the "bedrock:" prefix.
	log.Fatal(err)
}
```

The response includes Bedrock-specific metadata:

```go
fmt.Printf("Input: %d, Output: %d, Total: %d\n",
	resp.Usage.InputTokens,
	resp.Usage.OutputTokens,
	resp.Usage.TotalTokens,
)

// Stop reason is available in metadata.
fmt.Println("Stop reason:", resp.Metadata["stop_reason"])
```

The provider can also be constructed directly:
```go
import "github.com/lookatitude/beluga-ai/llm/providers/bedrock"

model, err := bedrock.New(config.ProviderConfig{
	Model:   "us.anthropic.claude-sonnet-4-5-20250929-v1:0",
	Options: map[string]any{"region": "us-west-2"},
})
```

For testing with a mock client:

```go
model := bedrock.NewWithClient(mockClient, "test-model")
```
| Model ID | Provider | Description |
| --- | --- | --- |
| `us.anthropic.claude-sonnet-4-5-20250929-v1:0` | Anthropic | Claude Sonnet 4.5 |
| `us.anthropic.claude-haiku-3-5-20241022-v1:0` | Anthropic | Claude Haiku 3.5 |
| `us.meta.llama3-3-70b-instruct-v1:0` | Meta | Llama 3.3 70B |
| `mistral.mistral-large-2407-v1:0` | Mistral | Mistral Large |
| `amazon.nova-pro-v1:0` | Amazon | Amazon Nova Pro |

Refer to the AWS Bedrock documentation for the full model catalog.