chandudreamz
New Member

How to trigger Power BI refresh from Databricks pipeline without keeping cluster alive

I have a Databricks pipeline that pulls data from AWS, which takes ~90 minutes. After this, I need to refresh a series of Power BI dataflows (45 mins) and then datasets (45 mins).

I want to trigger the Power BI refresh automatically from Databricks once the pipeline finishes. However, if I run this as a final task in the Databricks job, the cluster has to stay alive for up to 90 extra minutes while Power BI refreshes — which wastes compute and costs.

I want to avoid scheduled refreshes in Power BI since they don’t align well with my pipeline timing.

 Is there a recommended architecture to:

Trigger Power BI refreshes programmatically from Databricks

Let the refresh continue independently after triggering

Avoid cluster runtime during Power BI refresh wait time?

2 REPLIES
v-dineshya
Community Support

Hi @chandudreamz ,

Thank you for reaching out to the Microsoft Community Forum.

 

You are looking to trigger Power BI refreshes programmatically from a Databricks pipeline without keeping the cluster alive during the refresh wait time.

 

The best approach is to use Power Automate or a serverless option such as an Azure Function to trigger the Power BI refresh after the Databricks pipeline completes.


Please follow the steps below to trigger the Power BI refresh automatically.

 

1. Power BI exposes REST APIs to trigger refreshes for both dataflows and datasets. These APIs can be called from any HTTP-capable service.

 

2. At the end of your Databricks pipeline, instead of keeping the cluster alive, call a webhook that triggers a Power Automate flow, or invoke an Azure Function that calls the Power BI REST API.

 

Note: This decouples the refresh process from the Databricks job, allowing the cluster to shut down immediately after the pipeline finishes.
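The two steps above can be sketched as a short Python helper that would run as the last cell of the Databricks job. The workspace/artifact IDs and the token are placeholders (in practice you would acquire a service principal token, e.g. via MSAL); the endpoints are the Power BI refresh APIs for datasets and dataflows.

```python
# Sketch of a fire-and-forget refresh trigger from the final Databricks task.
# IDs and token below are hypothetical placeholders.
import json
import urllib.request

PBI_BASE = "https://api.powerbi.com/v1.0/myorg"

def dataset_refresh_url(group_id: str, dataset_id: str) -> str:
    """Endpoint for queueing a dataset (semantic model) refresh."""
    return f"{PBI_BASE}/groups/{group_id}/datasets/{dataset_id}/refreshes"

def dataflow_refresh_url(group_id: str, dataflow_id: str) -> str:
    """Endpoint for queueing a dataflow refresh."""
    return f"{PBI_BASE}/groups/{group_id}/dataflows/{dataflow_id}/refreshes"

def trigger_refresh(url: str, token: str) -> None:
    """POST the refresh request; the API returns as soon as the refresh is
    queued, so the Databricks cluster does not wait for it to finish."""
    req = urllib.request.Request(
        url,
        data=json.dumps({"notifyOption": "MailOnFailure"}).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    urllib.request.urlopen(req)  # 202 Accepted means the refresh was queued

if __name__ == "__main__":
    # Replace with your real workspace/dataset IDs and an AAD access token.
    trigger_refresh(dataset_refresh_url("<group-id>", "<dataset-id>"), "<token>")
```

Because the POST only queues the refresh, the notebook cell returns in seconds and the job cluster can terminate immediately.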


Power Automate Integration:

 

A Power Automate flow can be configured to wait for a signal (a webhook call or a status file in storage), trigger the Power BI refresh using the REST API, and optionally notify stakeholders or log the refresh status. This is a reliable way to refresh dataflows after a Databricks job.
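On the Databricks side, signalling the flow is a single HTTP POST. A minimal sketch, assuming the flow uses a "When an HTTP request is received" trigger; the flow URL and the payload fields are hypothetical placeholders:

```python
# Sketch of the final Databricks task: POST a completion signal to a Power
# Automate flow's HTTP trigger URL, then let the cluster shut down.
import json
import urllib.request

def build_signal(pipeline: str, status: str) -> bytes:
    """Serialize the completion signal the flow will parse (fields are
    whatever your flow expects; these names are illustrative)."""
    return json.dumps({"pipeline": pipeline, "status": status}).encode()

def notify_flow(flow_url: str, body: bytes) -> int:
    """Fire the webhook; returns the HTTP status code."""
    req = urllib.request.Request(
        flow_url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__":
    # Replace with the real trigger URL copied from the flow designer.
    notify_flow("https://prod-00.example.logic.azure.com/workflows/...",
                build_signal("aws_ingest", "succeeded"))
```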

 

Azure Function:

 

An Azure Function can be triggered by a Databricks webhook, a file drop in blob/S3 storage, or a message in a queue (Azure Service Bus or AWS SQS). It executes independently and can call the Power BI REST API without requiring the Databricks cluster to remain active.

 

Note: Using event-driven triggers ensures refreshes happen exactly when needed.
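Since your dataflows must finish before the datasets start, whichever serverless component receives the event also has to sequence the refreshes: trigger one, poll its status, then move to the next. A sketch of that control loop, with the `trigger` and `get_status` callables standing in for the real Power BI REST calls (those implementations are assumptions, not shown here):

```python
# Sketch of the sequencing logic an Azure Function (or flow) would run:
# refresh each artifact in order, waiting for completion between them.
import time
from typing import Callable

def refresh_in_sequence(
    artifacts: list,
    trigger: Callable[[str], None],
    get_status: Callable[[str], str],
    poll_seconds: float = 30.0,
) -> None:
    """Trigger each artifact's refresh and poll until it completes before
    starting the next one (e.g. all dataflows before any dataset)."""
    for artifact in artifacts:
        trigger(artifact)
        while True:
            status = get_status(artifact)  # e.g. from the refresh-history API
            if status == "Completed":
                break
            if status == "Failed":
                raise RuntimeError(f"refresh of {artifact} failed")
            time.sleep(poll_seconds)
```

Polling from a serverless host costs pennies compared with 90 minutes of idle cluster time, which is the point of moving this loop out of Databricks.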

 

Please refer to the Microsoft official documentation and community threads below.

Trigger dataflows and Power BI semantic models sequentially - Power Query | Microsoft Learn

Official Microsoft Power Automate documentation - Power Automate | Microsoft Learn

Solved: Re: Refresh Dataflow after Databricks job - Microsoft Fabric Community

Solved: Triggered Dataflow and Semantic Model Refresh - Microsoft Fabric Community

 

I hope this information helps. Please do let us know if you have any further queries.

 

Regards,

Dinesh

Hi @chandudreamz ,

We haven't heard from you since the last response and are just checking back to see whether you have a resolution yet. If you have any further queries, do let us know.

 

Regards,

Dinesh
