
Perplexity LLM Provider

The Perplexity provider connects Beluga AI to Perplexity’s search-augmented language models. Perplexity models combine LLM reasoning with real-time web search, making them well-suited for tasks requiring up-to-date information. Perplexity exposes an OpenAI-compatible API, so this provider supports all standard features including streaming.

Choose Perplexity when your application needs answers grounded in current web data without building a separate search pipeline. This is particularly useful for research assistants, news analysis, and any task where the model’s training data cutoff is a limitation.

Install the provider:

```shell
go get github.com/lookatitude/beluga-ai/llm/providers/perplexity
```
| Field | Required | Default | Description |
| --- | --- | --- | --- |
| `Model` | Yes | — | Model ID (e.g. `"sonar-pro"`) |
| `APIKey` | Yes | — | Perplexity API key (`pplx-...`) |
| `BaseURL` | No | `https://api.perplexity.ai` | Override API endpoint |
| `Timeout` | No | `30s` | Request timeout |
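A configuration with every field set explicitly might look like the sketch below. The `Timeout` type is assumed here to be a `time.Duration`; check the `config` package for the actual type.

```go
cfg := config.ProviderConfig{
	Model:   "sonar-pro",
	APIKey:  os.Getenv("PERPLEXITY_API_KEY"),
	BaseURL: "https://api.perplexity.ai", // optional; this is the default
	Timeout: 30 * time.Second,            // optional; assumed to be a time.Duration
}
```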

Environment variables:

| Variable | Maps to |
| --- | --- |
| `PERPLEXITY_API_KEY` | `APIKey` |
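Export the key in your shell before running (the key value below is a placeholder):

```shell
export PERPLEXITY_API_KEY="pplx-..."
```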
```go
package main

import (
	"context"
	"fmt"
	"log"
	"os"

	"github.com/lookatitude/beluga-ai/config"
	"github.com/lookatitude/beluga-ai/llm"
	"github.com/lookatitude/beluga-ai/schema"

	_ "github.com/lookatitude/beluga-ai/llm/providers/perplexity"
)

func main() {
	model, err := llm.New("perplexity", config.ProviderConfig{
		Model:  "sonar-pro",
		APIKey: os.Getenv("PERPLEXITY_API_KEY"),
	})
	if err != nil {
		log.Fatal(err)
	}

	msgs := []schema.Message{
		schema.NewSystemMessage("You are a helpful research assistant."),
		schema.NewHumanMessage("What are the latest developments in Go 1.24?"),
	}

	resp, err := model.Generate(context.Background(), msgs)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(resp.Text())
}
```
To stream tokens as they are generated:

```go
for chunk, err := range model.Stream(context.Background(), msgs) {
	if err != nil {
		log.Fatal(err)
	}
	fmt.Print(chunk.Delta)
}
fmt.Println()
```
Generation options can be passed per call:

```go
resp, err := model.Generate(ctx, msgs,
	llm.WithTemperature(0.2),
	llm.WithMaxTokens(2048),
	llm.WithTopP(0.9),
)
```

Lower temperatures are often preferred for factual search queries.

`Generate` returns a standard Go error that should always be checked:

```go
resp, err := model.Generate(ctx, msgs)
if err != nil {
	log.Fatal(err)
}
```
The provider can also be constructed directly instead of through the registry:

```go
import "github.com/lookatitude/beluga-ai/llm/providers/perplexity"

model, err := perplexity.New(config.ProviderConfig{
	Model:  "sonar-pro",
	APIKey: os.Getenv("PERPLEXITY_API_KEY"),
})
```
| Model ID | Description |
| --- | --- |
| `sonar-pro` | Advanced search-augmented model |
| `sonar` | Standard search-augmented model |

Refer to Perplexity’s documentation for the latest model list.