# Google Gemini LLM Provider
The Google provider connects Beluga AI to Google’s Gemini family of models using the official google.golang.org/genai SDK. It supports chat completions, streaming, tool calling, vision, and system instructions.
Choose Google Gemini when you need long context windows (up to 1M+ tokens), strong multimodal capabilities across text, images, and video, or integration with Google Cloud services. Gemini Flash models offer a strong balance of speed and quality for cost-sensitive workloads.
## Installation

```shell
go get github.com/lookatitude/beluga-ai/llm/providers/google
```

## Configuration

| Field | Required | Default | Description |
|---|---|---|---|
| Model | Yes | — | Model ID (e.g. "gemini-2.5-flash") |
| APIKey | Yes | — | Google AI API key |
| BaseURL | No | Gemini API default | Override API endpoint |
| Timeout | No | 30s | Request timeout |
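Putting the optional fields together, a fully specified configuration might look like the sketch below. The BaseURL value is a placeholder, and the Timeout field's Go type is assumed to be time.Duration; check the config package for the exact types.

```go
cfg := config.ProviderConfig{
	Model:   "gemini-2.5-flash",
	APIKey:  os.Getenv("GOOGLE_API_KEY"),
	BaseURL: "https://my-gemini-proxy.example.com", // optional endpoint override
	Timeout: 60 * time.Second,                      // default is 30s
}
```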
Environment variables:
| Variable | Maps to |
|---|---|
| GOOGLE_API_KEY | APIKey |
| GOOGLE_GENAI_API_KEY | APIKey |
## Basic Usage

```go
package main

import (
	"context"
	"fmt"
	"log"
	"os"

	"github.com/lookatitude/beluga-ai/config"
	"github.com/lookatitude/beluga-ai/llm"
	"github.com/lookatitude/beluga-ai/schema"

	// Blank import registers the Google provider with llm.New.
	_ "github.com/lookatitude/beluga-ai/llm/providers/google"
)

func main() {
	model, err := llm.New("google", config.ProviderConfig{
		Model:  "gemini-2.5-flash",
		APIKey: os.Getenv("GOOGLE_API_KEY"),
	})
	if err != nil {
		log.Fatal(err)
	}

	msgs := []schema.Message{
		schema.NewSystemMessage("You are a helpful assistant."),
		schema.NewHumanMessage("What is the capital of France?"),
	}

	resp, err := model.Generate(context.Background(), msgs)
	if err != nil {
		log.Fatal(err)
	}

	fmt.Println(resp.Text())
}
```

## Streaming
```go
for chunk, err := range model.Stream(context.Background(), msgs) {
	if err != nil {
		log.Fatal(err)
	}
	fmt.Print(chunk.Delta)
}
fmt.Println()
```

## Advanced Features
### Tool Calling
```go
tools := []schema.ToolDefinition{
	{
		Name:        "get_weather",
		Description: "Get current weather for a location",
		InputSchema: map[string]any{
			"type": "object",
			"properties": map[string]any{
				"location": map[string]any{
					"type":        "string",
					"description": "City name",
				},
			},
			"required": []any{"location"},
		},
	},
}

modelWithTools := model.BindTools(tools)
resp, err := modelWithTools.Generate(ctx, msgs, llm.WithToolChoice(llm.ToolChoiceAuto))
if err != nil {
	log.Fatal(err)
}

for _, tc := range resp.ToolCalls {
	fmt.Printf("Tool: %s, Args: %s\n", tc.Name, tc.Arguments)
}
```

Gemini supports the following tool choice modes:
| Beluga ToolChoice | Gemini Equivalent |
|---|---|
| llm.ToolChoiceAuto | AUTO |
| llm.ToolChoiceNone | NONE |
| llm.ToolChoiceRequired | ANY |
| llm.WithSpecificTool() | ANY + allowed function names |
### Vision (Multimodal)

```go
msgs := []schema.Message{
	schema.NewHumanMessageWithParts(
		schema.TextPart{Text: "Describe this image."},
		schema.ImagePart{
			Data:     imageBytes,
			MimeType: "image/png",
		},
	),
}

resp, err := model.Generate(ctx, msgs)
```

File URIs are also supported for images stored in Google Cloud:

```go
schema.ImagePart{URL: "gs://bucket/image.png"}
```

### System Instructions
System messages are automatically mapped to Gemini’s SystemInstruction parameter:

```go
msgs := []schema.Message{
	schema.NewSystemMessage("You are a code reviewer."),
	schema.NewHumanMessage("Review this function..."),
}
```

### Generation Options
```go
resp, err := model.Generate(ctx, msgs,
	llm.WithTemperature(0.7),
	llm.WithMaxTokens(4096),
	llm.WithTopP(0.9),
	llm.WithStopSequences("END"),
)
```

## Error Handling
```go
resp, err := model.Generate(ctx, msgs)
if err != nil {
	// Errors are wrapped with the "google:" prefix.
	log.Fatal(err)
}
```

Token usage is available on the response:
```go
fmt.Printf("Input: %d, Output: %d, Cached: %d\n",
	resp.Usage.InputTokens,
	resp.Usage.OutputTokens,
	resp.Usage.CachedTokens,
)
```
## Direct Construction

```go
import "github.com/lookatitude/beluga-ai/llm/providers/google"

model, err := google.New(config.ProviderConfig{
	Model:  "gemini-2.5-flash",
	APIKey: os.Getenv("GOOGLE_API_KEY"),
})
```

For testing with a custom HTTP client:
```go
model, err := google.NewWithHTTPClient(cfg, httpClient)
```
## Available Models

| Model ID | Description |
|---|---|
| gemini-2.5-pro | Most capable Gemini model |
| gemini-2.5-flash | Fast, balanced model |
| gemini-2.0-flash | Previous generation fast model |
Refer to Google AI’s model documentation for the latest model list.