Hi,
What is the best practice for triggering a refresh of a Power BI dataset once all of its upstream precursors have finished their jobs?
For example, if I have a scheduled refresh set at 5 PM but the database (which feeds the Power BI dataset) hasn't finished refreshing by 5, then we run into a problem.
So what is the solution to this problem?
Thanks,
Hello @mp390988,
Hope everything's going great with you. Just checking in: has the issue been resolved, or are you still running into problems? Sharing an update can really help others facing the same thing.
Thank you.
Hello @mp390988,
We hope you're doing well. Could you please confirm whether your issue has been resolved or if you're still facing challenges? Your update will be valuable to the community and may assist others with similar concerns.
Thank you.
Thank you all for your answers! Much appreciated.
My company has decided to go with Snowflake instead of Fabric, so does anyone know if there is functionality in Snowflake that will refresh the semantic model at the end of the pipeline?
Thanks
Hi @mp390988,
Thank you for posting your query in the Microsoft Fabric Community Forum.
While Snowflake itself doesn’t directly trigger a Power BI semantic model refresh, you can still follow the same event-based approach by calling the Power BI REST API at the end of your Snowflake pipeline. Most Snowflake environments use tools like Snowflake Tasks, dbt Cloud, Airflow, Azure Data Factory or even a simple webhook to Power Automate to run a final step that triggers the dataset refresh once the data load completes.
This ensures your semantic model always refreshes only after Snowflake has finished processing.
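For illustration, here is a minimal PowerShell sketch of what that final pipeline step could look like. Every ID and secret below is a placeholder, and it assumes a service principal (client-credentials flow) that has been allowed to use the Power BI REST APIs and added to the target workspace:

# All IDs and the secret below are placeholders, not real values
$tenantId = "<tenant-guid>"
$clientId = "<service-principal-app-id>"
$secret   = "<client-secret>"

# Acquire an Azure AD token for the Power BI REST API (client-credentials flow)
$tokenBody = @{
    grant_type    = "client_credentials"
    client_id     = $clientId
    client_secret = $secret
    scope         = "https://analysis.windows.net/powerbi/api/.default"
}
$token = (Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" `
    -Body $tokenBody).access_token

# Trigger the semantic model refresh as the last step of the pipeline
Invoke-RestMethod -Method Post `
    -Uri "https://api.powerbi.com/v1.0/myorg/groups/<workspace-id>/datasets/<dataset-id>/refreshes" `
    -Headers @{ Authorization = "Bearer $token" }

A dbt Cloud job, Airflow DAG, or ADF pipeline can run this script (or the equivalent HTTP calls in its own language) as its final task once the Snowflake load succeeds.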
Thanks, @Zanqueta, @cengizhanarslan, @FBergamaschi & @burakkaragoz for sharing valuable insights.
Best regards,
Ganesh Singamshetty.
Hi @mp390988,
Let me add my contribution to this discussion:
If this response was helpful in any way, I'd gladly accept a 👍, much like the joy of seeing a DAX measure work the first time without needing another FILTER.
Please mark it as the correct solution. It helps other community members find their way faster (and saves them from another endless loop 🌀).
The best practice is not to rely on a fixed refresh schedule when your data has upstream dependencies.
Instead, trigger the Power BI dataset (semantic model) refresh only after all upstream jobs complete successfully. In Fabric, this is done by adding a Semantic model refresh step at the end of your pipeline. The pipeline should first run the database refresh, dataflows, or notebooks, and only once those finish without errors should it trigger the dataset refresh.
This approach ensures Power BI always refreshes against fully prepared data, avoids timing issues, and removes the need for fragile time-based scheduling.
Hi @mp390988 ,
@FBergamaschi is completely right: "Event-Based" triggers are the best practice, whereas "Time-Based" schedules (like your 5 PM refresh) are prone to the exact race condition you described.
However, running a PowerShell script is just one way to execute this. Depending on your infrastructure, here are the 3 Industry Standard Patterns to solve this "Dependency Chaining" problem:
1. The Enterprise Method: Azure Data Factory / Synapse Pipelines
If you are using a cloud ETL tool (like ADF) to load your database, this is the "Gold Standard."
How: Add a final activity to your pipeline after the "Load Data" activity succeeds.
Activity: Use the native "Power BI Semantic Model Refresh" activity (preview) or a Web Activity calling the Power BI REST API.
Benefit: The dataset refreshes exactly 1 second after the data load finishes. Zero gaps.
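As a rough sketch, the Web Activity settings would look something like this (the workspace and dataset IDs are placeholders, and it assumes the factory's managed identity has been added to the Power BI workspace):

URL:            https://api.powerbi.com/v1.0/myorg/groups/<workspace-id>/datasets/<dataset-id>/refreshes
Method:         POST
Body:           {}
Authentication: System-assigned managed identity
Resource:       https://analysis.windows.net/powerbi/api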
2. The On-Premises SQL Method: SQL Server Agent Jobs
If your database is an on-prem SQL Server, you are likely using a SQL Agent Job to run the stored procedures that update the data.
How: Add a final Job Step to that existing job.
Action: This step runs the PowerShell script provided by @FBergamaschi.
Benefit: The refresh is tightly coupled to the database update success. If the DB update fails, the refresh never triggers (preventing empty reports).
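If you prefer to add that step in code rather than through SSMS, a sketch like this (the server, job name, and script path are all hypothetical) appends a PowerShell-type step to the existing job via msdb:

# Assumes the SqlServer PowerShell module; all names here are hypothetical
Invoke-Sqlcmd -ServerInstance "MyDwServer" -Database "msdb" -Query @"
EXEC dbo.sp_add_jobstep
    @job_name  = N'NightlyDwLoad',
    @step_name = N'Refresh Power BI semantic model',
    @subsystem = N'PowerShell',
    @command   = N'C:\Scripts\Refresh-PbiDataset.ps1';
"@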
3. The Low-Code Method: Power Automate
If you don't have access to ADF or the SQL Server Agent, Power Automate is the most flexible bridge.
How: Create a Flow with the "Refresh a dataset" action.
Trigger:
Option A: If your database supports it, use a "When a row is inserted" trigger (have your DB write to a 'Log' table when it finishes).
Option B (Webhook): Create the flow with a "When an HTTP request is received" trigger. Have your database script or ETL tool send a simple 'ping' (POST request) to this URL when it finishes.
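The 'ping' itself can be a one-liner from PowerShell; the flow URL below is a placeholder for the one Power Automate generates when you save the HTTP trigger:

# Hypothetical flow URL; copy the real one from the trigger card in Power Automate
Invoke-RestMethod -Method Post -ContentType "application/json" `
    -Uri "https://prod-00.westus.logic.azure.com/workflows/<flow-id>/triggers/manual/paths/invoke?<signature>" `
    -Body '{"status":"load_complete"}'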
Summary: Stop guessing the time (5 PM). Make the completion of the database job the trigger for the Power BI job.
If my response resolved your query, kindly mark it as the Accepted Solution to assist others. Additionally, I would be grateful for a 'Kudos' if you found my response helpful.
This response was assisted by AI for translation and formatting purposes.
The best solution is running a PowerShell script once the upstream database refresh has completed.
This script triggers the Power BI dataset (semantic model) refresh.
Here is an example I used with a customer a few months ago:
# Requires the MicrosoftPowerBIMgmt module; sign in first
Connect-PowerBIServiceAccount
# Ask Power BI to e-mail the owner if the refresh fails
$MailFailureNotify = @{ "notifyOption" = "MailOnFailure" } | ConvertTo-Json
# Trigger the refresh (replace the workspace and dataset GUIDs with your own)
Invoke-PowerBIRestMethod -Url 'groups/808561ec-90ee-436f-b27c-9be5e0cb6bf4/datasets/a02568ca-e73b-4579-9402-ba0dd4e8a268/refreshes' -Method Post -Body $MailFailureNotify -Verbose
If this helped, please consider giving kudos and mark as a solution
@me in replies or I'll lose your thread
Want to check your DAX skills? Answer my biweekly DAX challenges on the kubisco LinkedIn page
Consider voting for this Power BI idea
Francesco Bergamaschi
MBA, M.Eng, M.Econ, Professor of BI
Hi @FBergamaschi ,
Thank you for your reply.
I honestly have no idea what this means:
$MailFailureNotify = @{ "notifyOption" = "MailOnFailure" } | ConvertTo-Json
Invoke-PowerBIRestMethod -Url 'groups/808561ec-90ee-436f-b27c-9be5e0cb6bf4/datasets/a02568ca-e73b-4579-9402-ba0dd4e8a268/refreshes' -Method Post -Body $MailFailureNotify -Verbose
And where do you put this code?
Thank You
Hi @mp390988,
that is PowerShell code.
PowerShell is a command-line shell where you can run commands like the one I showed you. Install it, and then you can run the code; the point, however, is that the script should be launched only when the precursors have finished. I am not sure about your architecture, but in the case of a SQL Server DWH, there is a scheduler (SQL Server Agent) that can launch the code.
Anyway, I am not a SQL expert, so I cannot help more on this.
I hope my post is a solution for you anyway; I can assure you this approach works perfectly.
PS: I forgot to mention that the workspace and dataset GUIDs in the URL need to be adapted to your own tenant
$MailFailureNotify = @{ "notifyOption" = "MailOnFailure" } | ConvertTo-Json
Invoke-PowerBIRestMethod -Url 'groups/808561ec-90ee-436f-b27c-9be5e0cb6bf4/datasets/a02568ca-e73b-4579-9402-ba0dd4e8a268/refreshes' -Method Post -Body $MailFailureNotify -Verbose
Here you can find more info.
If this helped, please consider giving kudos and mark as a solution
@me in replies or I'll lose your thread
Want to check your DAX skills? Answer my biweekly DAX challenges on the kubisco LinkedIn page
Consider voting for this Power BI idea
Francesco Bergamaschi
MBA, M.Eng, M.Econ, Professor of BI