# Qwen LLM Provider (Alibaba)
The Qwen provider connects Beluga AI to Alibaba’s Qwen family of models via the DashScope API. Qwen exposes an OpenAI-compatible API, so this provider supports all standard features including streaming, tool calling, and structured output.
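Because the API is OpenAI-compatible, you can sanity-check your DashScope key and connectivity with a plain HTTP request before wiring up the provider. This is a sketch against DashScope's compatible-mode endpoint; it assumes `DASHSCOPE_API_KEY` is set to a valid key:

```shell
curl https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions \
  -H "Authorization: Bearer $DASHSCOPE_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "qwen-plus", "messages": [{"role": "user", "content": "Hello"}]}'
```

A successful response is a standard OpenAI-style chat completion JSON object.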
Choose Qwen when you need strong multilingual support, especially for Chinese and other Asian languages. Qwen models also offer extended context variants (qwen-long) and competitive pricing on the DashScope platform, making them a good choice for multilingual applications and Asian market deployments.
## Installation

```shell
go get github.com/lookatitude/beluga-ai/llm/providers/qwen
```

## Configuration

| Field | Required | Default | Description |
|---|---|---|---|
| Model | Yes | — | Model ID (e.g. `"qwen-plus"`) |
| APIKey | Yes | — | DashScope API key (`sk-...`) |
| BaseURL | No | `https://dashscope.aliyuncs.com/compatible-mode/v1` | Override API endpoint |
| Timeout | No | 30s | Request timeout |
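In code, these fields map onto `config.ProviderConfig`. A minimal sketch of a fully specified config, assuming `Timeout` is a `time.Duration` (only `Model` and `APIKey` are required):

```go
cfg := config.ProviderConfig{
	Model:   "qwen-plus",
	APIKey:  os.Getenv("DASHSCOPE_API_KEY"),
	BaseURL: "https://dashscope.aliyuncs.com/compatible-mode/v1", // optional; shown at its default
	Timeout: 60 * time.Second,                                    // optional; default is 30s
}
model, err := llm.New("qwen", cfg)
```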
Environment variables:
| Variable | Maps to |
|---|---|
| DASHSCOPE_API_KEY | APIKey |
## Basic Usage

```go
package main

import (
	"context"
	"fmt"
	"log"
	"os"

	"github.com/lookatitude/beluga-ai/config"
	"github.com/lookatitude/beluga-ai/llm"
	"github.com/lookatitude/beluga-ai/schema"

	_ "github.com/lookatitude/beluga-ai/llm/providers/qwen"
)

func main() {
	model, err := llm.New("qwen", config.ProviderConfig{
		Model:  "qwen-plus",
		APIKey: os.Getenv("DASHSCOPE_API_KEY"),
	})
	if err != nil {
		log.Fatal(err)
	}

	msgs := []schema.Message{
		schema.NewSystemMessage("You are a helpful assistant."),
		schema.NewHumanMessage("What is the capital of France?"),
	}

	resp, err := model.Generate(context.Background(), msgs)
	if err != nil {
		log.Fatal(err)
	}

	fmt.Println(resp.Text())
}
```

## Streaming
```go
for chunk, err := range model.Stream(context.Background(), msgs) {
	if err != nil {
		log.Fatal(err)
	}
	fmt.Print(chunk.Delta)
}
fmt.Println()
```

## Advanced Features

### Tool Calling
```go
modelWithTools := model.BindTools(tools)
resp, err := modelWithTools.Generate(ctx, msgs, llm.WithToolChoice(llm.ToolChoiceAuto))
```

### Structured Output
```go
resp, err := model.Generate(ctx, msgs,
	llm.WithResponseFormat(llm.ResponseFormat{Type: "json_object"}),
)
```

### Generation Options
```go
resp, err := model.Generate(ctx, msgs,
	llm.WithTemperature(0.7),
	llm.WithMaxTokens(2048),
	llm.WithTopP(0.9),
	llm.WithStopSequences("END"),
)
```

## Error Handling
```go
resp, err := model.Generate(ctx, msgs)
if err != nil {
	log.Fatal(err)
}
```

## Direct Construction
```go
import "github.com/lookatitude/beluga-ai/llm/providers/qwen"

model, err := qwen.New(config.ProviderConfig{
	Model:  "qwen-plus",
	APIKey: os.Getenv("DASHSCOPE_API_KEY"),
})
```

## Available Models

| Model ID | Description |
|---|---|
| qwen-plus | Balanced cost and performance for general tasks |
| qwen-turbo | Fastest, lowest cost per token |
| qwen-max | Most capable, best for complex reasoning |
| qwen-long | Extended context window for long documents |
Refer to Alibaba Cloud’s documentation for the latest model list.