Qwen LLM Provider (Alibaba)

The Qwen provider connects Beluga AI to Alibaba’s Qwen family of models via the DashScope API. Qwen exposes an OpenAI-compatible API, so this provider supports all standard features including streaming, tool calling, and structured output.

Choose Qwen when you need strong multilingual support, especially for Chinese and other Asian languages. Qwen also offers extended-context variants (qwen-long) and competitive pricing on the DashScope platform, making it a good fit for Asian-market deployments.

Install the provider:

```shell
go get github.com/lookatitude/beluga-ai/llm/providers/qwen
```
Configuration:

| Field | Required | Default | Description |
| --- | --- | --- | --- |
| Model | Yes | | Model ID (e.g. `"qwen-plus"`) |
| APIKey | Yes | | DashScope API key (`sk-...`) |
| BaseURL | No | `https://dashscope.aliyuncs.com/compatible-mode/v1` | Override API endpoint |
| Timeout | No | 30s | Request timeout |
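The optional fields can be set in the same `ProviderConfig` as the required ones. A sketch, assuming `Timeout` accepts a `time.Duration`; the `BaseURL` value shown is simply the documented default made explicit:

```go
model, err := llm.New("qwen", config.ProviderConfig{
	Model:   "qwen-plus",
	APIKey:  os.Getenv("DASHSCOPE_API_KEY"),
	BaseURL: "https://dashscope.aliyuncs.com/compatible-mode/v1", // documented default, shown explicitly
	Timeout: 60 * time.Second,                                    // raise from the 30s default if needed
})
```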

Environment variables:

| Variable | Maps to |
| --- | --- |
| `DASHSCOPE_API_KEY` | APIKey |
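Set the key in your shell before running your program, then pass it through the config as in the examples below:

```shell
export DASHSCOPE_API_KEY="sk-..."
```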
Basic usage:

```go
package main

import (
	"context"
	"fmt"
	"log"
	"os"

	"github.com/lookatitude/beluga-ai/config"
	"github.com/lookatitude/beluga-ai/llm"
	"github.com/lookatitude/beluga-ai/schema"

	_ "github.com/lookatitude/beluga-ai/llm/providers/qwen" // register the provider
)

func main() {
	model, err := llm.New("qwen", config.ProviderConfig{
		Model:  "qwen-plus",
		APIKey: os.Getenv("DASHSCOPE_API_KEY"),
	})
	if err != nil {
		log.Fatal(err)
	}

	msgs := []schema.Message{
		schema.NewSystemMessage("You are a helpful assistant."),
		schema.NewHumanMessage("What is the capital of France?"),
	}

	resp, err := model.Generate(context.Background(), msgs)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(resp.Text())
}
```
Streaming:

```go
for chunk, err := range model.Stream(context.Background(), msgs) {
	if err != nil {
		log.Fatal(err)
	}
	fmt.Print(chunk.Delta)
}
fmt.Println()
```
Tool calling (with `tools` being the tool definitions you have declared):

```go
modelWithTools := model.BindTools(tools)
resp, err := modelWithTools.Generate(ctx, msgs, llm.WithToolChoice(llm.ToolChoiceAuto))
```
Structured output (JSON mode):

```go
resp, err := model.Generate(ctx, msgs,
	llm.WithResponseFormat(llm.ResponseFormat{Type: "json_object"}),
)
```
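Once the reply is constrained to JSON, it can be decoded with the standard library. A minimal sketch, assuming the model was prompted to return a `{"capital": ..., "country": ...}` object and that `resp.Text()` yields the raw JSON string; `CityInfo` and `parseCityInfo` are illustrative names, not part of Beluga AI:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// CityInfo mirrors the JSON shape the model was asked to produce.
type CityInfo struct {
	Capital string `json:"capital"`
	Country string `json:"country"`
}

// parseCityInfo decodes a JSON-mode reply into a CityInfo.
func parseCityInfo(raw string) (CityInfo, error) {
	var info CityInfo
	err := json.Unmarshal([]byte(raw), &info)
	return info, err
}

func main() {
	// In real code this string would come from resp.Text().
	raw := `{"capital": "Paris", "country": "France"}`
	info, err := parseCityInfo(raw)
	if err != nil {
		panic(err)
	}
	fmt.Println(info.Capital) // Paris
}
```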
Generation options:

```go
resp, err := model.Generate(ctx, msgs,
	llm.WithTemperature(0.7),
	llm.WithMaxTokens(2048),
	llm.WithTopP(0.9),
	llm.WithStopSequences("END"),
)
```
Errors are returned from `Generate` and should always be checked:

```go
resp, err := model.Generate(ctx, msgs)
if err != nil {
	log.Fatal(err)
}
```
You can also import the provider package and construct it directly:

```go
import "github.com/lookatitude/beluga-ai/llm/providers/qwen"

model, err := qwen.New(config.ProviderConfig{
	Model:  "qwen-plus",
	APIKey: os.Getenv("DASHSCOPE_API_KEY"),
})
```
| Model ID | Description |
| --- | --- |
| `qwen-plus` | Balanced cost and performance for general tasks |
| `qwen-turbo` | Fastest, lowest cost per token |
| `qwen-max` | Most capable, best for complex reasoning |
| `qwen-long` | Extended context window for long documents |

Refer to Alibaba Cloud’s documentation for the latest model list.