This connection is implemented as a custom function in Power Query code, with no complex middleware. Calling it is as simple as invoking the custom function. We can leverage DeepSeek AI's capabilities to read data from Power BI tables and return results. In this example, we receive online customer feedback and want to classify each item into categories such as "Inquiry", "Complaint", and "Refund".
let
/*
* Azure DeepSeek API Connector for Power Query
* ---------------------------------------
* A function to interact with Azure DeepSeek's Large Language Model API.
*/
fnDeepSeekChat = (
// Required parameters
systemContent as text, // System instructions for AI behavior
userContent as text, // User's actual query or prompt
// Optional parameters
optional model as text, // Model name to use
optional maxTokens as number, // Controls maximum response length
optional temperature as number // Controls randomness (0-1), lower = more deterministic
) as any =>
let
// Azure API Configuration, replace with your own endpoint
baseUrl = "https://<YOUR OWN URL>.services.ai.azure.com/models/chat/completions",
apiVersion = "2024-05-01-preview",
apiUrl = baseUrl & "?api-version=" & apiVersion,
// Replace with your own API key (better kept outside the query, e.g. in a parameter)
apiKey = "<YOUR OWN API KEY>",
// Default settings
DefaultModel = "DeepSeek-V3",
DefaultMaxTokens = 300, // Default token limit
DefaultTemperature = 0.7, // Balanced between creativity and determinism
// Set final parameter values
finalModel = if model <> null then model else DefaultModel,
finalMaxTokens = if maxTokens <> null then maxTokens else DefaultMaxTokens,
finalTemperature = if temperature <> null then temperature else DefaultTemperature,
// Configure request headers - Azure uses api-key
headers = [
#"Content-Type" = "application/json",
#"api-key" = apiKey
],
// Build request body
requestBody = Json.FromValue([
model = finalModel,
messages = {
[role = "system", content = systemContent],
[role = "user", content = userContent]
},
max_tokens = finalMaxTokens,
temperature = finalTemperature,
stream = false
]),
// Execute API request with error handling
// Adding 500ms delay to avoid potential rate limiting
response = Function.InvokeAfter(
() => try Web.Contents(apiUrl, [
Headers = headers,
Content = requestBody,
// Manual handling of error status codes for better diagnostics
ManualStatusHandling = {400, 401, 403, 404, 429, 500}
]) otherwise null,
#duration(0, 0, 0, 0.5)
),
// Parse JSON response
json = if response is null then null else try Json.Document(response) otherwise null,
// Extract response text with error handling
result =
if json <> null and Record.HasFields(json, "choices") and List.Count(json[choices]) > 0 then
try json[choices]{0}[message][content] otherwise "Error: Invalid response format"
else
"Error: No valid response"
in
result
in
fnDeepSeekChat
Now that we have this custom function, how do we use it? I'll introduce two common methods: direct invocation and batch processing in tables.
Direct Invoking for Results: This method is similar to using the DeepSeek or ChatGPT app directly: simply pass the parameters into the custom function.
systemContent: "You are a customer service classification expert. Categorize the input text into exactly one of the following categories: 'Inquiry', 'Complaint', 'Refund', 'Support', 'Suggestion'"
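With that system prompt, a direct call might look like this (the feedback string here is a made-up sample, and the model is expected to reply with a single category label):

fnDeepSeekChat(
    "You are a customer service classification expert. Categorize the input text into exactly one of the following categories: 'Inquiry', 'Complaint', 'Refund', 'Support', 'Suggestion'",
    "My order arrived damaged and I would like my money back."
)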
Batch Processing in Tables: A more powerful approach is applying the function to tables to process multiple records in a batch. This method is particularly suitable for classification and text labeling:
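As a sketch of the batch pattern, assuming your feedback sits in a query called FeedbackTable with a text column named Feedback (both names are placeholders for your own data), Table.AddColumn invokes the function once per row:

let
    Source = FeedbackTable, // placeholder: your query returning the feedback rows
    Classified = Table.AddColumn(
        Source,
        "Category",
        each fnDeepSeekChat(
            "You are a customer service classification expert. Categorize the input text into exactly one of the following categories: 'Inquiry', 'Complaint', 'Refund', 'Support', 'Suggestion'",
            [Feedback],
            null, // default model
            20,   // a label needs only a few tokens
            0     // deterministic output for classification
        ),
        type text
    )
in
    Classified

Note that the 500ms Function.InvokeAfter delay inside the function also spaces out these per-row calls, which helps with rate limiting on larger tables.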
Hi @jianfajun ,
Really cool use case, and thanks for sharing the Power Query function. That’s super helpful for others trying to connect LLM APIs directly from Power BI.
Just a few extra tips that might help:
If you're calling the DeepSeek API frequently or with large payloads, keep an eye on rate limits and token usage. Some models can throttle or return 429 errors if you go over the quota.
Wrapping the Web.Contents call in try...otherwise, as your function already does, catches failed responses and keeps a single bad call from breaking the whole refresh.
Also, if you're planning to refresh this in the Power BI Service, make sure the API endpoint is accessible from the gateway, and that the gateway is configured to allow web requests.
One more thing — if you're using this for classification or tagging, you could cache the results locally (e.g. in a table or file) to avoid repeated API calls for the same input.
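To sketch that caching idea (all table and column names here are hypothetical), you can left-join new feedback against a table of previously classified rows and only call the API where the cache misses:

let
    Joined = Table.NestedJoin(NewFeedback, {"Feedback"}, CachedResults, {"Feedback"}, "Cache", JoinKind.LeftOuter),
    Expanded = Table.ExpandTableColumn(Joined, "Cache", {"Category"}, {"CachedCategory"}),
    Filled = Table.AddColumn(
        Expanded,
        "Category",
        each if [CachedCategory] <> null
             then [CachedCategory] // reuse the cached label, no API call
             else fnDeepSeekChat(
                 "Categorize the input text into exactly one of: 'Inquiry', 'Complaint', 'Refund', 'Support', 'Suggestion'",
                 [Feedback], null, 20, 0
             ),
        type text
    )
in
    Table.RemoveColumns(Filled, {"CachedCategory"})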
Really nice implementation overall.