Hello there Fabricators,
My aim is to create a report that is automatically refreshed with new data every day.
I created a report in PBI Desktop which has a Lakehouse in my dev_Gold layer as the data source.
I uploaded this report to OneDrive and imported it into my dev_Gold workspace. This created a Report, a Semantic model and a Dashboard in the dev_Gold workspace.
I deployed the report through deployment pipelines, and in my prod_Gold layer I now have a Report and a Semantic model (since Dashboards are unsupported for Git, the Dashboard did not get deployed).
In this prod_Gold layer I scheduled a daily refresh for the Semantic model via the Semantic model settings, and configured the gateway that connects to my Lakehouse in the dev_Gold layer.
Every day the Semantic model seems to be refreshed, which is confirmed by the Refreshed column dates next to the Semantic model, and also when I explore the data in the Semantic model.
Then I open my report in the prod_Gold workspace, which uses the mentioned Semantic model (which has the refreshed data values), but the Report itself does not show the new data. Refreshing the visuals in the Report does not help.
Only after navigating to the Semantic model in my prod_Gold workspace and refreshing it manually does the Report update its visuals with the right data.
I had enough of that and decided to create a new report that uses this Semantic model directly as its data source. This report does update with the latest data when opened in PBI Desktop, but when I deploy it in the same way as the initial Report, the same issue arises, i.e. the Semantic model has the new data, but the Report connected directly to this Semantic model does not.
Here is a visual representation of the workflow I have tried so far:
This issue completely halts my efforts to automate my report refreshes consistently and forces me to refresh the Semantic model manually again after the (successful) scheduled refresh every day.
Is there anyone who has the same issue, or maybe even a reliable solution?
Also, does anyone have a reliable solution for keeping their reports updated using only Fabric-native tools?
Thank you in advance
Hi @x_mark_x ,
The “Refreshed” timestamp can update even if no data actually changes, or if there is a caching issue. Fabric might be completing a metadata-level refresh while query results remain cached.
Try triggering a full refresh via the Power BI REST API or XMLA endpoint, using refreshType: full. This performs an explicit cache invalidation. You can wrap the call in a Power Automate flow or a script and trigger it after your pipeline.
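For illustration, here is a minimal Python sketch of that REST call, assuming you already have an Azure AD access token with dataset write permissions; the workspace and semantic model IDs below are placeholders, not values from this thread:

```python
import requests

# Placeholders (not values from this thread): supply your own workspace (group) ID,
# semantic model (dataset) ID, and an Azure AD access token with dataset write rights.
WORKSPACE_ID = "<workspace-id>"
DATASET_ID = "<semantic-model-id>"
ACCESS_TOKEN = "<access-token>"

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/datasets/{DATASET_ID}/refreshes"
)

# Sending a body with "type": "full" requests an enhanced (full data) refresh,
# which reloads the data instead of only touching metadata.
response = requests.post(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"type": "full"},
)
response.raise_for_status()
print("Refresh request accepted, HTTP status:", response.status_code)  # expect 202
```

The same call can be issued from a Power Automate HTTP action or from a notebook step at the end of your pipeline.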
Regards,
Chaithra E.
Thank you for your answer @v-echaithra ,
I honestly don't know yet how to do the tasks you described, so if you have some good documentation or even a tutorial / example, that would be most welcome.
I also wonder what the purpose of the scheduled refresh and the Semantic model refresh activity is if neither of them can do what they are supposed to do... 🙄
Is it possible to use the Get metadata activity to see if the metadata has been changed? If yes, how could that be done?
This morning I came to my laptop and checked whether the pipeline ran and whether I have the latest data in the Lakehouse.
The pipeline succeeded (including the Refresh Semantic Model activity) and I had the latest data in my Lakehouse.
When I opened the report (which has the semantic model as its data source), it showed the old data, so I explored the data in the semantic model, which also had the old data in it, even though the pipeline "refreshed" it and the Refreshed column showed the correct date as per the pipeline run.
Then I clicked on the Refresh now button on the Semantic Model and voilà, the Semantic Model and the report both show the latest data.
My bottom line is that whichever method I choose (be it a Semantic Model Refresh activity in a pipeline or a scheduled semantic model refresh) to automatically refresh the Semantic Model, even though it appears refreshed on paper, in reality the data is not refreshed.
Manually refreshing the Semantic Model, on the other hand, usually works, but it sucks to have to refresh all my Semantic Models manually every day if I want to keep my reports up to date.
I don't think that automatically refreshing a Semantic Model should be so complex or prone to error, so I wonder:
- Is it only me who experiences problems with automatically refreshing the semantic models?
- Am I missing something?
- Is automatically refreshing a Semantic Model a sort of black magic?
Please let me know if you also face this problem, and please let me know how you do automatic Semantic Model refreshes yourself.
Hi @x_mark_x ,
Thank you for your contributions @Shahid12523 , @rohit1991 .
We wanted to follow up to see if the issue you reported has been fully resolved. If you still have any concerns or need additional support, please don’t hesitate to let us know, we’re here to help.
We truly appreciate your patience and look forward to assisting you further if needed.
Warm regards,
Chaithra E.
Hi @v-echaithra ,
As I mentioned to @rohit1991, I am still conducting my tests and will get back with feedback on Monday.
I want to note though that I had the same problem yesterday, as I described in my original post.
I could confirm that I received the new data into the lakehouse, but this time when I checked the data in my semantic model, it was still showing the old data even after a seemingly successful scheduled refresh, and of course the same applied to the report as well.
After a manual refresh of the semantic model, the report I connected to it picked up the new data and was working fine, but most interestingly, the same semantic model still did not show the new data when explored.
To add more context, I read up a little about semantic model refresh and ended up appending a Semantic model refresh activity to my ETL pipeline instead of using the scheduled refresh directly on the semantic model, which I have since turned off.
After running my ETL pipeline, I immediately checked my report, which was showing the new, updated data. As of now, it seems like the Semantic model refresh activity works where the scheduled refresh of the semantic model did not.
To confirm this I still need some time for testing, so I might close this case later next week once I have tested the Semantic model refresh activity method more thoroughly.
Thank you for your patience!
Problem: Dataset (semantic model) refresh works, but reports still show cached/old data.
Cause: Report visuals use query cache, which isn’t always cleared after refresh.
Fixes:
- Disable query caching in the semantic model settings.
- Use Direct Lake auto-refresh if a Lakehouse is the source.
- Trigger a cache clear via the API / Power Automate right after the dataset refresh.
- Build reports with a live connection to the semantic model (thin reports).
👉 Best Fabric-native solution: Direct Lake + disabled query caching.
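As a diagnostic complement to the fixes above, the refresh history endpoint shows whether the last run was a scheduled, API, or enhanced-API refresh and whether it actually completed. A minimal Python sketch (the workspace ID, semantic model ID, and access token are placeholders you would supply yourself):

```python
import requests

# Placeholders (not values from this thread): workspace ID, semantic model ID, access token.
WORKSPACE_ID = "<workspace-id>"
DATASET_ID = "<semantic-model-id>"
ACCESS_TOKEN = "<access-token>"

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/datasets/{DATASET_ID}/refreshes?$top=1"
)

# Fetch the most recent entry from the semantic model's refresh history.
response = requests.get(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
response.raise_for_status()

latest = response.json()["value"][0]
# refreshType distinguishes Scheduled / ViaApi / ViaEnhancedApi runs, and status
# shows whether the run completed, failed, or is still in progress.
print(latest["refreshType"], latest["status"], latest.get("startTime"), latest.get("endTime"))
```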
Thank you for the insight @rohit1991.
I am going to test your solution thoroughly this week and will get back with my findings on Monday next week at the latest.
Hi @x_mark_x
The problem occurs because the report is still connected to the original imported dataset from Desktop, so even though the Semantic Model in the Gold workspace refreshes daily, the visuals do not update. That’s why creating a new report directly on the Semantic Model shows the latest data, while the original report remains stale. The fix is to rebind the report so it points to the Semantic Model in the service:
1. In the Power BI service, go to Get data > Power BI datasets.
2. Select your Semantic Model in the Gold workspace.
3. Build or republish the report using this Semantic Model.
Once the report is connected this way, it will always reflect the refreshed data from the Semantic Model without requiring manual refresh.
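If you prefer not to rebuild the report by hand, the Power BI REST API also offers a report rebind call that repoints an existing report at a different semantic model. A minimal Python sketch (the IDs and token are placeholders, not values from this thread):

```python
import requests

# Placeholders (not values from this thread): workspace ID, report ID, the ID of the
# semantic model the report should point at, and an Azure AD access token.
WORKSPACE_ID = "<workspace-id>"
REPORT_ID = "<report-id>"
TARGET_DATASET_ID = "<semantic-model-id>"
ACCESS_TOKEN = "<access-token>"

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/reports/{REPORT_ID}/Rebind"
)

# Rebinds the existing report to the specified semantic model in the service,
# so its visuals read from the model that the scheduled/pipeline refresh updates.
response = requests.post(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"datasetId": TARGET_DATASET_ID},
)
response.raise_for_status()
print("Rebind succeeded, HTTP status:", response.status_code)  # expect 200
```

Note that after a rebind the visuals will only render correctly if the target semantic model exposes the same tables and columns the report was built on.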
Here’s a simple visual to illustrate: