# Fireworks AI LLM Provider

The Fireworks AI provider connects Beluga AI to Fireworks’ inference platform, which specializes in fast, cost-effective serving of open-source models. Fireworks exposes an OpenAI-compatible API, so this provider supports all standard features including streaming, tool calling, and structured output.

Choose Fireworks AI when you need fast inference for open-source models with competitive per-token pricing. Fireworks supports custom model deployments and fine-tuned models alongside its hosted catalog, making it suitable for production workloads that require both speed and cost efficiency.
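To make "OpenAI-compatible" concrete, here is a rough sketch of the kind of HTTP request the provider issues under the hood. This is illustrative only, not the provider's actual code: the endpoint path and body shape follow the standard OpenAI chat-completions convention.

```go
package main

import (
	"bytes"
	"fmt"
	"net/http"
	"os"
)

// buildRequest sketches an OpenAI-style chat-completions request against
// the Fireworks inference endpoint. The JSON body shape follows the
// OpenAI schema; the provider constructs equivalent requests for you.
func buildRequest() *http.Request {
	url := "https://api.fireworks.ai/inference/v1/chat/completions"
	body := []byte(`{"model": "accounts/fireworks/models/llama-v3p1-8b-instruct",
		"messages": [{"role": "user", "content": "What is the capital of France?"}]}`)

	req, _ := http.NewRequest("POST", url, bytes.NewReader(body))
	// Fireworks authenticates with a Bearer token, like the OpenAI API.
	req.Header.Set("Authorization", "Bearer "+os.Getenv("FIREWORKS_API_KEY"))
	req.Header.Set("Content-Type", "application/json")
	return req
}

func main() {
	req := buildRequest()
	fmt.Println(req.Method, req.URL.String())
	// POST https://api.fireworks.ai/inference/v1/chat/completions
}
```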

## Installation

```sh
go get github.com/lookatitude/beluga-ai/llm/providers/fireworks
```
## Configuration

| Field | Required | Default | Description |
| --- | --- | --- | --- |
| `Model` | No | `accounts/fireworks/models/llama-v3p1-70b-instruct` | Model ID |
| `APIKey` | Yes | (none) | Fireworks API key (`fw_...`) |
| `BaseURL` | No | `https://api.fireworks.ai/inference/v1` | Override the API endpoint |
| `Timeout` | No | `30s` | Request timeout |

Environment variables:

| Variable | Maps to |
| --- | --- |
| `FIREWORKS_API_KEY` | `APIKey` |
## Basic usage

```go
package main

import (
	"context"
	"fmt"
	"log"
	"os"

	"github.com/lookatitude/beluga-ai/config"
	"github.com/lookatitude/beluga-ai/llm"
	"github.com/lookatitude/beluga-ai/schema"

	// Blank import registers the provider with the llm registry.
	_ "github.com/lookatitude/beluga-ai/llm/providers/fireworks"
)

func main() {
	model, err := llm.New("fireworks", config.ProviderConfig{
		Model:  "accounts/fireworks/models/llama-v3p1-70b-instruct",
		APIKey: os.Getenv("FIREWORKS_API_KEY"),
	})
	if err != nil {
		log.Fatal(err)
	}

	msgs := []schema.Message{
		schema.NewSystemMessage("You are a helpful assistant."),
		schema.NewHumanMessage("What is the capital of France?"),
	}

	resp, err := model.Generate(context.Background(), msgs)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(resp.Text())
}
```
## Streaming

`Stream` yields chunks as they arrive; print each delta as it comes in:

```go
for chunk, err := range model.Stream(context.Background(), msgs) {
	if err != nil {
		log.Fatal(err)
	}
	fmt.Print(chunk.Delta)
}
fmt.Println()
```
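If you need the complete text after streaming, accumulate the deltas as they arrive rather than only printing them. A minimal stdlib sketch with hypothetical delta values standing in for streamed chunks:

```go
package main

import (
	"fmt"
	"strings"
)

// joinDeltas rebuilds the full response text from streamed deltas.
func joinDeltas(deltas []string) string {
	var b strings.Builder
	for _, d := range deltas {
		b.WriteString(d)
	}
	return b.String()
}

func main() {
	// Hypothetical deltas as they might arrive from a streaming call.
	deltas := []string{"The capital ", "of France ", "is Paris."}
	fmt.Println(joinDeltas(deltas)) // The capital of France is Paris.
}
```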
## Tool calling

Bind tools to the model, then let it decide when to invoke them:

```go
modelWithTools := model.BindTools(tools)
resp, err := modelWithTools.Generate(ctx, msgs, llm.WithToolChoice(llm.ToolChoiceAuto))
```
## Structured output

Request a JSON response:

```go
resp, err := model.Generate(ctx, msgs,
	llm.WithResponseFormat(llm.ResponseFormat{Type: "json_object"}),
)
```
## Generation options

Standard sampling options can be passed per call:

```go
resp, err := model.Generate(ctx, msgs,
	llm.WithTemperature(0.7),
	llm.WithMaxTokens(2048),
	llm.WithTopP(0.9),
	llm.WithStopSequences("END"),
)
if err != nil {
	log.Fatal(err)
}
```
## Direct construction

To construct the provider directly instead of going through the registry:

```go
import "github.com/lookatitude/beluga-ai/llm/providers/fireworks"

model, err := fireworks.New(config.ProviderConfig{
	Model:  "accounts/fireworks/models/llama-v3p1-70b-instruct",
	APIKey: os.Getenv("FIREWORKS_API_KEY"),
})
```
## Available models

| Model ID | Description |
| --- | --- |
| `accounts/fireworks/models/llama-v3p1-70b-instruct` | Llama 3.1 70B |
| `accounts/fireworks/models/llama-v3p1-8b-instruct` | Llama 3.1 8B |
| `accounts/fireworks/models/mixtral-8x7b-instruct` | Mixtral 8x7B |
| `accounts/fireworks/models/qwen2p5-72b-instruct` | Qwen 2.5 72B |

Refer to Fireworks AI’s documentation for the full model catalog.