Hello there Fabricators,
My aim is to create a report that is automatically refreshed with new data every day.
I created a report in PBI Desktop which has a Lakehouse in my dev_Gold layer as the data source.
I uploaded this report to OneDrive and imported it into my dev_Gold workspace. This created a Report, a Semantic model and a Dashboard in the dev_Gold workspace.
I deployed the report through deployment pipelines, and in my prod_Gold layer I now have a Report and a Semantic model (as the Dashboard is unsupported for Git, it did not get deployed).
In this prod_Gold layer I scheduled a daily refresh for the Semantic model via the Semantic model settings, and configured the gateway that connects to my Lakehouse in the dev_Gold layer.
Every day the Semantic model seems to be refreshed, which is confirmed by the Refreshed column dates next to the Semantic model, and also when I explore the data in the Semantic model.
Then I open my report in the prod_Gold workspace, which uses the aforementioned Semantic model (with the refreshed data values), but the Report itself does not show the new data. Refreshing the visuals in the Report does not help.
Only after navigating to the Semantic model in my prod_Gold workspace and refreshing it manually does the Report update its visuals with the right data.
Having had enough of that, I created a new report that uses this Semantic model directly as its data source. This report does update with the latest data when opened in PBI Desktop, but when I deploy it in the same way as the initial Report, the same issue arises: the Semantic model has the new data, but the Report connected directly to it does not.
Here is a visual representation of the workflow I have tried so far:
This issue completely halts my efforts to automate my report refreshes consistently and forces me to refresh the Semantic model manually every day after the (successful) scheduled refresh.
Is there anyone who has the same issue, or maybe even a reliable solution?
Also does anyone have a reliable solution for keeping their reports updated, using only the Fabric native tools?
Thank you in advance
I think I may have a solution... I've added a step that hits the endpoint with a simple query before refreshing my model, and it looks like it might do the trick.
You may want to follow my thread on the subject here: https://community.fabric.microsoft.com/t5/Service/Semantic-Model-connected-to-Lakehouse-SQL-Analytic...
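In case it helps anyone else, here is a minimal sketch of what that warm-up step could look like, assuming a Python host (for example a notebook or pipeline script) with pyodbc installed and access to the Lakehouse SQL analytics endpoint; the endpoint and database names below are placeholders, not my actual setup:

```python
# Hypothetical warm-up step: run a trivial query against the Lakehouse SQL
# analytics endpoint before the semantic model refresh kicks in.
# The server and database values are placeholders.
import pyodbc

SQL_ENDPOINT = "<your-sql-analytics-endpoint>.datawarehouse.fabric.microsoft.com"
DATABASE = "dev_Gold_Lakehouse"  # placeholder Lakehouse name

conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    f"Server={SQL_ENDPOINT};"
    f"Database={DATABASE};"
    "Authentication=ActiveDirectoryInteractive;"  # swap for a service principal in automation
    "Encrypt=yes;"
)

# A simple SELECT is enough to wake the endpoint so the subsequent
# scheduled refresh reads current data.
with pyodbc.connect(conn_str) as conn:
    conn.cursor().execute("SELECT 1").fetchall()
```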
Great, glad to be able to help!
It really shouldn't be this hard, should it?
Indeed @DuncanKing,
Either there should be an option in the refresh activity or scheduled refresh to "warm up" the data source with a pre-operation query or similar, or there should be easy-to-find, understandable documentation about this behaviour with an easy-to-follow solution tutorial.
Hopefully your forum post, or maybe this one, gets picked up by the community at least...
I haven't had a chance to get near it yet, been super busy with client workshops on other projects!
Problem: Dataset (semantic model) refresh works, but reports still show cached/old data.
Cause: Report visuals use query cache, which isn’t always cleared after refresh.
Fixes:
Disable query caching in the semantic model settings.
Use DirectLake auto-refresh if Lakehouse is source.
Trigger a cache clear via API/Power Automate right after the dataset refresh (see the sketch after this list).
Build reports with live connection to the semantic model (thin reports).
👉 Best Fabric-native solution: DirectLake + disable query caching.
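On the API/Power Automate option above: I'm not aware of a dedicated public "clear cache" call, but the usual building block is the Power BI REST API's on-demand refresh endpoint (Power Automate's "Refresh a dataset" action does essentially the same thing). A rough sketch, assuming you already have an Azure AD token with the Dataset.ReadWrite.All scope; the workspace and dataset IDs are placeholders:

```python
# Sketch: trigger an on-demand refresh of the semantic model right after the
# upstream data lands, instead of relying only on the daily schedule.
import requests

ACCESS_TOKEN = "<aad-access-token>"          # placeholder
WORKSPACE_ID = "<prod_Gold-workspace-id>"    # placeholder
DATASET_ID = "<semantic-model-id>"           # placeholder

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/datasets/{DATASET_ID}/refreshes"
)

resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"notifyOption": "NoNotification"},
)
resp.raise_for_status()  # HTTP 202 means the refresh request was queued
```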
Thank you for the insight @rohit1991.
I am going to test your solution thoroughly this week and will get back with my findings by Monday next week at the latest.
Hi @x_mark_x
The problem occurs because the report is still connected to the original imported dataset from Desktop, so even though the Semantic Model in the Gold workspace refreshes daily, the visuals do not update. That’s why creating a new report directly on the Semantic Model shows the latest data, while the original report remains stale. The fix is to rebind the report so it points to the Semantic Model in the service:
In Power BI Service, go to Get data > Power BI datasets.
Select your Semantic Model in the Gold workspace.
Build or republish the report using this Semantic Model.
Once the report is connected this way, it will always reflect the refreshed data from the Semantic Model without requiring manual refresh.
Here’s a simple visual to illustrate:
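If you prefer to script the rebind instead of redoing it in the UI, the Power BI REST API also has a Rebind Report In Group operation that points an existing report at a different semantic model. A small sketch, assuming a token with the Report.ReadWrite.All scope; all IDs below are placeholders:

```python
# Sketch: rebind the existing prod report to the Semantic Model in the Gold
# workspace so its visuals always read the refreshed data.
import requests

ACCESS_TOKEN = "<aad-access-token>"          # placeholder
WORKSPACE_ID = "<prod_Gold-workspace-id>"    # placeholder
REPORT_ID = "<report-id>"                    # placeholder
TARGET_DATASET_ID = "<semantic-model-id>"    # placeholder

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/reports/{REPORT_ID}/Rebind"
)

resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"datasetId": TARGET_DATASET_ID},
)
resp.raise_for_status()  # success means the report now targets the new model
```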