mp390988
Post Patron

Refreshing dataset

Hi,

 

I wanted to know: what is the best practice for triggering a refresh of a Power BI dataset once all of its precursors have finished their jobs?

 

For example, if I have a scheduled refresh set at 5pm but the database (which feeds the Power BI dataset) hasn't finished refreshing by 5, then we run into a problem.

 

So what is the solution to this problem?

 

Thanks,

10 REPLIES
v-ssriganesh
Community Support

Hello @mp390988,

Hope everything’s going great with you. Just checking in: has the issue been resolved, or are you still running into problems? Sharing an update can really help others facing the same thing.

Thank you.

v-ssriganesh
Community Support

Hello @mp390988,

We hope you're doing well. Could you please confirm whether your issue has been resolved or if you're still facing challenges? Your update will be valuable to the community and may assist others with similar concerns.

Thank you.

mp390988
Post Patron

Thank you all for your answers! Much appreciated.

At my company, they have decided to go with Snowflake instead of Fabric, so does anyone know if there is functionality in Snowflake that will refresh the semantic model at the end of the pipeline?

 

Thanks

Hi @mp390988,
Thank you for posting your query in the Microsoft Fabric Community Forum.

While Snowflake itself doesn’t directly trigger a Power BI semantic model refresh, you can still follow the same event-based approach by calling the Power BI REST API at the end of your Snowflake pipeline. Most Snowflake environments use tools like Snowflake Tasks, dbt Cloud, Airflow, Azure Data Factory or even a simple webhook to Power Automate to run a final step that triggers the dataset refresh once the data load completes.

This ensures your semantic model always refreshes only after Snowflake has finished processing.
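
Whichever tool runs that final step, the call it makes is the same: a POST to the Power BI REST API's refreshes endpoint. Here is a minimal PowerShell sketch of that call, assuming a service principal (app registration) with access to the workspace; the tenant, app, workspace, and dataset IDs below are placeholders, not real values:

# Acquire an Azure AD token for the Power BI API (service principal is an assumption)
$tenantId = '<tenant-id>'; $appId = '<app-id>'; $secret = '<app-secret>'
$token = Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" -Body @{ grant_type = 'client_credentials'; client_id = $appId; client_secret = $secret; scope = 'https://analysis.windows.net/powerbi/api/.default' }

# Trigger the semantic model refresh
Invoke-RestMethod -Method Post -Uri 'https://api.powerbi.com/v1.0/myorg/groups/<workspace-id>/datasets/<dataset-id>/refreshes' -Headers @{ Authorization = "Bearer $($token.access_token)" }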

Thanks, @Zanqueta, @cengizhanarslan, @FBergamaschi & @burakkaragoz for sharing valuable insights.

Best regards,
Ganesh Singamshetty.

Zanqueta
Solution Sage

Hi @mp390988,

 

Let me add my contribution to this discussion:

 

Best Practices to Solve This

Option 1: Use Power Automate or Fabric Dataflows

  • Instead of relying on a fixed schedule, trigger the Power BI dataset refresh after the upstream process completes.
  • For example:
    • When your ETL or database job finishes, send a webhook or event to Power Automate.
    • Power Automate then calls the Power BI REST API to refresh the dataset.
  • This ensures the refresh happens only when the data is ready.
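
As a rough sketch of the final ETL step in Option 1: the "webhook" is just a POST to the flow's HTTP trigger URL. The URL below is a hypothetical placeholder for the one Power Automate generates when you save the flow:

# Final step of the ETL job: ping the Power Automate flow that performs the refresh
$flowUrl = 'https://<region>.logic.azure.com/workflows/<flow-id>/triggers/manual/paths/invoke'  # hypothetical placeholder
Invoke-RestMethod -Uri $flowUrl -Method Post -Body (@{ status = 'load_complete' } | ConvertTo-Json) -ContentType 'application/json'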

Option 2: Use Data Pipelines in Microsoft Fabric

  • If you are using Fabric, you can orchestrate the entire process in a Data Pipeline:
    • Step 1: Run your precursor jobs (SQL, Lakehouse, etc.).
    • Step 2: Add a Power BI Refresh activity at the end of the pipeline.
  • This gives full control and avoids timing conflicts.
 

If this response was helpful in any way, I’d gladly accept a 👍, much like the joy of seeing a DAX measure work first time without needing another FILTER.

Please mark it as the correct solution. It helps other community members find their way faster (and saves them from another endless loop 🌀).

cengizhanarslan
Continued Contributor

The best practice is not to rely on a fixed refresh schedule when your data has upstream dependencies.

 

Instead, trigger the Power BI dataset (semantic model) refresh only after all upstream jobs complete successfully. In Fabric, this is done by adding a Semantic model refresh step at the end of your pipeline. The pipeline should first run the database refresh, dataflows, or notebooks, and only once those finish without errors should it trigger the dataset refresh.

 

This approach ensures Power BI always refreshes against fully prepared data, avoids timing issues, and removes the need for fragile time-based scheduling.
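
If you want the pipeline (or a script) to verify the outcome rather than fire and forget, the same REST API exposes the refresh history. A small sketch with placeholder IDs, assuming you are already signed in with the MicrosoftPowerBIMgmt module:

# Inspect the most recent refresh; status is 'Completed', 'Failed',
# or 'Unknown' while the refresh is still in progress
$history = Invoke-PowerBIRestMethod -Method Get -Url 'groups/<workspace-id>/datasets/<dataset-id>/refreshes?$top=1' | ConvertFrom-Json
$history.value[0].status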

_________________________________________________________
If this helped, ✓ Mark as Solution | Kudos appreciated
Connect on LinkedIn
burakkaragoz
Community Champion

Hi @mp390988 ,

@FBergamaschi is completely right: "Event-Based" triggers are the best practice, whereas "Time-Based" schedules (like your 5 PM refresh) are prone to the exact race condition you described.

However, running a PowerShell script is just one way to execute this. Depending on your infrastructure, here are the 3 Industry Standard Patterns to solve this "Dependency Chaining" problem:

1. The Enterprise Method: Azure Data Factory / Synapse Pipelines

If you are using a cloud ETL tool (like ADF) to load your database, this is the "Gold Standard."

  • How: Add a final activity to your pipeline after the "Load Data" activity succeeds.

  • Activity: Use the native "Power BI Semantic Model Refresh" activity (preview) or a Web Activity calling the Power BI REST API.

  • Benefit: The dataset refresh starts the moment the data load finishes. Zero gaps.

2. The On-Premises SQL Method: SQL Server Agent Jobs

If your database is an on-prem SQL Server, you are likely using a SQL Agent Job to run the stored procedures that update the data.

  • How: Add a final Job Step to that existing job.

  • Action: This step runs the PowerShell script provided by @FBergamaschi.

  • Benefit: The refresh is tightly coupled to the database update success. If the DB update fails, the refresh never triggers (preventing empty reports).
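
As a sketch of what that job step could contain, here is the refresh call with the non-interactive sign-in a SQL Agent job needs. The service principal and all IDs are placeholder assumptions, and the secret should come from a vault rather than be hard-coded:

# Non-interactive sign-in with a service principal (placeholder values)
$secret = ConvertTo-SecureString '<app-secret>' -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential('<app-id>', $secret)
Connect-PowerBIServiceAccount -ServicePrincipal -Credential $cred -Tenant '<tenant-id>'

# Trigger the dataset refresh as the final job step
Invoke-PowerBIRestMethod -Url 'groups/<workspace-id>/datasets/<dataset-id>/refreshes' -Method Post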

3. The Low-Code Method: Power Automate

If you don't have access to ADF or the SQL Server Agent, Power Automate is the most flexible bridge.

  • How: Create a Flow with the "Refresh a dataset" action.

  • Trigger:

    • Option A: If your database supports it, use a "When a row is inserted" trigger (have your DB write to a 'Log' table when it finishes).

    • Option B (Webhook): Create the flow with a "When an HTTP request is received" trigger. Have your database script or ETL tool send a simple 'ping' (POST request) to this URL when it finishes.

Summary: Stop guessing the time (5 PM). Make the completion of the database job the trigger for the Power BI job.


If my response resolved your query, kindly mark it as the Accepted Solution to assist others. Additionally, I would be grateful for a 'Kudos' if you found my response helpful.
This response was assisted by AI for translation and formatting purposes.

FBergamaschi
Solution Sage
Solution Sage

The best solution is running a PowerShell script once the precursor (database) refresh has completed.

 

This script is launched to trigger the Power BI dataset (semantic model) refresh.

 

Here is an example I used with a customer a few months ago:

 

# Requires the MicrosoftPowerBIMgmt module; sign in first
Connect-PowerBIServiceAccount

$MailFailureNotify = @{"notifyOption"="MailOnFailure"} | ConvertTo-Json  # body must be JSON

Invoke-PowerBIRestMethod -Url 'groups/808561ec-90ee-436f-b27c-9be5e0cb6bf4/datasets/a02568ca-e73b-4579-9402-ba0dd4e8a268/refreshes' -Body $MailFailureNotify -Method Post -Verbose

 

If this helped, please consider giving kudos and mark as a solution

@me in replies or I'll lose your thread

Want to check your DAX skills? Answer my biweekly DAX challenges on the kubisco LinkedIn page

Consider voting this Power BI idea

Francesco Bergamaschi

MBA, M.Eng, M.Econ, Professor of BI

Hi @FBergamaschi ,

 

Thank you for your reply.

 

I honestly have no idea what this means:

 

$MailFailureNotify = @{"notifyOption"="MailOnFailure"} | ConvertTo-Json

Invoke-PowerBIRestMethod -Url 'groups/808561ec-90ee-436f-b27c-9be5e0cb6bf4/datasets/a02568ca-e73b-4579-9402-ba0dd4e8a268/refreshes' -Body $MailFailureNotify -Method Post -Verbose

 

And where do you put this code?

 

Thank You

 

 

Hi @mp390988,

That is PowerShell code.

 

https://learn.microsoft.com/en-us/powershell/scripting/install/install-powershell-on-windows?view=po...

 

It is a command-line shell where you can run commands like the one I showed you. Install it and then you can run the code, but the point is that the script should be launched only when the precursors have finished. I am not sure about your architecture, but in the case of a SQL Server DWH there is a scheduler (the SQL Server Agent) that can launch the code.

 

Anyway, I am not a SQL expert, so I cannot help more on this.

 

I hope my post is a solution for you anyway; I can assure you this approach works perfectly.

 

PS: I forgot to mention that the workspace and dataset GUIDs in the URL need to be adapted to your tenant

 

$MailFailureNotify = @{"notifyOption"="MailOnFailure"} | ConvertTo-Json

Invoke-PowerBIRestMethod -Url 'groups/808561ec-90ee-436f-b27c-9be5e0cb6bf4/datasets/a02568ca-e73b-4579-9402-ba0dd4e8a268/refreshes' -Body $MailFailureNotify -Method Post -Verbose

 

Here you can find more info:

https://www.fourmoo.com/2018/06/05/using-the-power-bi-api-with-powershell-scripts-refreshing-your-da...

 

 

If this helped, please consider giving kudos and mark as a solution

@me in replies or I'll lose your thread

Want to check your DAX skills? Answer my biweekly DAX challenges on the kubisco LinkedIn page

Consider voting this Power BI idea

Francesco Bergamaschi

MBA, M.Eng, M.Econ, Professor of BI
