
# xAI Grok LLM Provider

The xAI provider connects Beluga AI to xAI’s Grok family of models. xAI exposes an OpenAI-compatible API, so this provider supports all standard features including streaming, tool calling, and structured output.

Choose xAI when you want access to Grok models, which offer strong reasoning and conversational capabilities. Grok-3 is competitive with frontier models from other vendors and offers a large context window for extended conversations.

```shell
go get github.com/lookatitude/beluga-ai/llm/providers/xai
```
| Field | Required | Default | Description |
| --- | --- | --- | --- |
| `Model` | No | `"grok-3"` | Model ID |
| `APIKey` | Yes | | xAI API key (`xai-...`) |
| `BaseURL` | No | `https://api.x.ai/v1` | Override API endpoint |
| `Timeout` | No | `30s` | Request timeout |

Environment variables:

| Variable | Maps to |
| --- | --- |
| `XAI_API_KEY` | `APIKey` |
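For example, the key can be supplied through the environment before running your program (the value shown is a placeholder, not a real key):

```shell
# Export the key so os.Getenv("XAI_API_KEY") can pick it up.
export XAI_API_KEY="xai-..."
```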
```go
package main

import (
	"context"
	"fmt"
	"log"
	"os"

	"github.com/lookatitude/beluga-ai/config"
	"github.com/lookatitude/beluga-ai/llm"
	"github.com/lookatitude/beluga-ai/schema"

	_ "github.com/lookatitude/beluga-ai/llm/providers/xai"
)

func main() {
	model, err := llm.New("xai", config.ProviderConfig{
		Model:  "grok-3",
		APIKey: os.Getenv("XAI_API_KEY"),
	})
	if err != nil {
		log.Fatal(err)
	}

	msgs := []schema.Message{
		schema.NewSystemMessage("You are a helpful assistant."),
		schema.NewHumanMessage("What is the capital of France?"),
	}

	resp, err := model.Generate(context.Background(), msgs)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(resp.Text())
}
```
Streaming yields chunks as they arrive:

```go
for chunk, err := range model.Stream(context.Background(), msgs) {
	if err != nil {
		log.Fatal(err)
	}
	fmt.Print(chunk.Delta)
}
fmt.Println()
```
Tool calling works through the OpenAI-compatible API:

```go
modelWithTools := model.BindTools(tools)
resp, err := modelWithTools.Generate(ctx, msgs, llm.WithToolChoice(llm.ToolChoiceAuto))
```
For structured output, enable JSON mode:

```go
resp, err := model.Generate(ctx, msgs,
	llm.WithResponseFormat(llm.ResponseFormat{Type: "json_object"}),
)
```
Common generation options:

```go
resp, err := model.Generate(ctx, msgs,
	llm.WithTemperature(0.7),
	llm.WithMaxTokens(2048),
	llm.WithTopP(0.9),
	llm.WithStopSequences("END"),
)
```
Always check the error returned by `Generate`:

```go
resp, err := model.Generate(ctx, msgs)
if err != nil {
	log.Fatal(err)
}
```
The provider can also be constructed directly:

```go
import "github.com/lookatitude/beluga-ai/llm/providers/xai"

model, err := xai.New(config.ProviderConfig{
	Model:  "grok-3",
	APIKey: os.Getenv("XAI_API_KEY"),
})
```
| Model ID | Description |
| --- | --- |
| `grok-3` | Most capable Grok model, large context window |
| `grok-3-mini` | Fast, cost-effective, with strong reasoning |
| `grok-2` | Previous generation, still capable |

Refer to xAI’s documentation for the latest model list.