
x_mark_x
Advocate I

Semantic model refresh does not refresh data in report

Hello there Fabricators,

 

My aim is to create a report that is refreshed automatically every day with new data.

 

I created a report in PBI Desktop which has a Lakehouse in my dev_Gold layer as the data source.

I uploaded this report to OneDrive and imported it into my dev_Gold workspace. This created a Report, a Semantic model and a Dashboard in that workspace.

I deployed the report through deployment pipelines, and my prod_Gold layer now has a Report and a Semantic model (the Dashboard did not get deployed, as Dashboards are unsupported for Git).

In this prod_Gold layer I scheduled a daily refresh for the Semantic model via the Semantic model settings, and configured the gateway that connects to my Lakehouse in the dev_Gold layer.

Every day the Semantic model appears to be refreshed: this is confirmed by the dates in the Refreshed column next to the Semantic model, and also when I explore the data in the Semantic model.

 

Then I open my report in the prod_Gold workspace, which uses the mentioned Semantic model (with the refreshed data values), but the Report itself does not show the new data. Refreshing the visuals in the Report does not help.

 

Only after I navigate to the Semantic model in my prod_Gold workspace and refresh it manually does the Report update its visuals with the right data.

 

I had enough of that and decided to create a new report that uses this Semantic model directly as its data source. This report does update with the latest data when opened in PBI Desktop, but when I deploy it the same way as the initial Report, the same issue arises, i.e. the Semantic model has the new data, but the Report connected directly to this Semantic model does not.

 

Here is a visual representation of the workflow I have tried so far:
[Image: Report refresh workflow.png]

This issue completely halts my efforts to automate my report refreshes consistently and forces me to refresh the Semantic model manually every day after the (successful) scheduled refresh.

 

Is there anyone who has the same issue, or maybe even a reliable solution?

Also, does anyone have a reliable solution for keeping their reports updated using only the Fabric-native tools?

 

Thank you in advance

10 REPLIES
v-echaithra
Community Support

Hi @x_mark_x ,

The “Refreshed” timestamp can update even if no data actually changes, or if there is a caching issue. Fabric might be completing only a metadata-level refresh while the query results remain cached.

Try triggering a full refresh via the Power BI REST API or the XMLA endpoint, using refreshType: full, which forces an explicit cache invalidation. You can wrap this in a Power Automate flow or a script and trigger it right after your pipeline.
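A minimal sketch of that REST call in Python, assuming a service principal (or user token) with access to the workspace; all IDs and secrets below are placeholders:

```python
# Sketch: trigger a full refresh of a semantic model via the Power BI REST API.
# All IDs/secrets are placeholders; the identity used must have access to the workspace.
import msal
import requests

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<app-client-id>"
CLIENT_SECRET = "<app-client-secret>"
WORKSPACE_ID = "<workspace-guid>"      # e.g. the prod_Gold workspace
DATASET_ID = "<semantic-model-guid>"

# Acquire an app-only token for the Power BI service
app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(
    scopes=["https://analysis.windows.net/powerbi/api/.default"]
)["access_token"]

# Enhanced refresh: type "Full" reprocesses the data rather than just touching metadata
resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/datasets/{DATASET_ID}/refreshes",
    headers={"Authorization": f"Bearer {token}"},
    json={"type": "Full"},
)
resp.raise_for_status()  # 202 Accepted means the refresh request was queued
```

The same POST can be issued from a Power Automate HTTP action or an equivalent web/HTTP step at the end of your pipeline.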

Regards,
Chaithra E.

Thank you for your answer @v-echaithra ,

I honestly don't know yet how to do these tasks you described, so if you have some good documentation or even a tutorial / example, that would be most welcome.

I also wonder what the purpose of the scheduled refresh and the Semantic model refresh activity is if neither of them can do what they are supposed to do... 🙄

Is it possible to use the Get metadata activity to see if the metadata has changed? If so, how could that be done?

Silence from Microsoft Community support...🙄

 

In case you are wondering: no, the issue is not solved. I still need to manually refresh my semantic models every day to get the latest data.

 

So I wonder, how do other Fabricators keep their reports up to date? Does everybody do a manual refresh, or is there a way to keep the reports up to date automatically @v-echaithra , @Shahid12523 , @rohit1991 ?

Hi @x_mark_x ,

You can create a Power Automate flow to refresh the data.


Power Automate Desktop focuses on automating tasks on a local computer, allowing the creation of flows that interact with desktop applications and web browsers. It provides a visual designer to create automation without coding.

 

Publish a Power BI dataset to the Power BI Service and place it in the workspace that it’s going to be viewed from. Power Automate will need a path to locate the Power BI dataset.
Ensure that the Power Automate account has access to the Power BI Workspace. This usually isn’t a problem if you’ve logged into both services using the same Microsoft account.

 

From the Power Automate website, follow these instructions:
  1. Click on Create.
  2. Select Automated Cloud Flow or Scheduled Cloud Flow.
  3. Name your workflow and search for a trigger or event.
  4. Click on Next Step.
  5. Choose the operation "Refresh a Power BI Dataset".
  6. Update it with your specific Workspace and Dataset to refresh.
  7. Click Save.

 

At this point, your Power Automate Workflow will be active.
If a file is uploaded to the SharePoint folder that the workflow points to, it will tell Power BI to begin refreshing the data.
To check that your Power Automate refresh is working, you can click on Test in the top right corner. This tells Power Automate to watch for an updated file and to begin logging the workflow actions for you to review.


You can also refer to this Microsoft Blog: Refresh your Power BI dataset using Microsoft Flow | Microsoft Power BI Blog | Microsoft Power BI

Hope this helps.

x_mark_x
Advocate I

This morning I came to my laptop and checked whether the pipeline had run and whether I had the latest data in the Lakehouse.

The pipeline succeeded (including the Refresh Semantic Model activity) and I had the latest data in my Lakehouse.

When I opened the report (which has the semantic model as its data source), it showed the old data, so I explored the data in the semantic model, which also contained the old data, even though the pipeline "refreshed" it and the Refreshed column showed the correct date from the pipeline run.

Then I clicked the Refresh now button on the Semantic Model and voilà, the Semantic Model and the report both showed the latest data.

 

My bottom line is that whichever method I choose to automatically refresh the Semantic Model (be it a Semantic Model Refresh activity in a pipeline or a scheduled semantic model refresh), even though it appears to have been refreshed on paper, in reality the data is not refreshed.
Manually refreshing the Semantic Model, on the other hand, usually works, but it sucks to have to refresh all my Semantic Models manually every day if I want to keep my reports up to date.

 

I don't think that automatically refreshing a Semantic Model should be so complex or prone to error, so I wonder:
- Am I the only one who experiences problems with automatically refreshing semantic models?

- Am I missing something?

- Is automatically refreshing a Semantic Model a sort of black magic?

 

Please let me know if you also face this problem, and how you handle automatic Semantic Model refreshes yourself.

v-echaithra
Community Support

Hi @x_mark_x ,

Thank you for your contributions @Shahid12523 , @rohit1991 .

We wanted to follow up to see if the issue you reported has been fully resolved. If you still have any concerns or need additional support, please don’t hesitate to let us know, we’re here to help.
We truly appreciate your patience and look forward to assisting you further if needed.

Warm regards,
Chaithra E.

Hi @v-echaithra ,

As I mentioned to @rohit1991, I am still conducting my tests and will get back with feedback on Monday.

 

I do want to note, though, that I had the same problem yesterday, as I described in my original post.

I could confirm that I received the new data in the lakehouse, but this time when I checked the data in my semantic model, it was still showing the old data even after a seemingly successful scheduled refresh, and of course the same applied to the report as well.

 

After a manual refresh of the semantic model, the report I connected to it updated with the new data and was working fine, but most interestingly the same semantic model still did not show the new data when explored.

 

To add more context, I read up a little on semantic model refreshes and ended up appending a Semantic model refresh activity to my ETL pipeline instead of using the scheduled refresh on the semantic model itself, which I turned off.

 

After running my ETL pipeline, I immediately checked my report, which was showing the new, updated data. As of now, it seems that the Semantic model refresh activity works where the scheduled refresh of the semantic model does not.

 

To confirm this I still need some time for testing, so I might close this case later next week once I have been able to test the Semantic model refresh activity method more thoroughly.

Thank you for your patience!

Shahid12523
Resident Rockstar

Problem: Dataset (semantic model) refresh works, but reports still show cached/old data.

Cause: Report visuals use the query cache, which isn't always cleared after a refresh.

Fixes:

- Disable query caching in the semantic model settings.

- Use Direct Lake auto-refresh if a Lakehouse is the source.

- Trigger a cache clear via the API/Power Automate right after the dataset refresh (see the REST API sketch below).

- Build reports with a live connection to the semantic model (thin reports).

👉 Best Fabric-native solution: Direct Lake + disable query caching.
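As a related diagnostic for the API option above, you can pull the semantic model's refresh history via the REST API to confirm what kind of refresh actually ran and whether it completed. A minimal sketch in Python (workspace/model IDs and the bearer token are placeholders; token acquisition as in the earlier msal example):

```python
# Sketch: inspect the last few refreshes of a semantic model via the Power BI REST API.
# WORKSPACE_ID, DATASET_ID and TOKEN are placeholders.
import requests

WORKSPACE_ID = "<workspace-guid>"
DATASET_ID = "<semantic-model-guid>"
TOKEN = "<bearer-token>"

resp = requests.get(
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/datasets/{DATASET_ID}/refreshes",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"$top": 5},
)
resp.raise_for_status()

for r in resp.json().get("value", []):
    # refreshType distinguishes e.g. "Scheduled" from "ViaApi"/"ViaEnhancedApi";
    # status is typically "Completed", "Failed", or "Unknown" while still running
    print(r.get("refreshType"), r.get("status"), r.get("startTime"), r.get("endTime"))
```

If the scheduled run shows up as Completed but the visuals are still stale, that points at the report binding or caching rather than the refresh itself.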

Shahed Shaikh
x_mark_x
Advocate I

Thank you for the insight @rohit1991.

 

I am going to test your solution thoroughly this week and will get back with my findings by Monday next week at the latest.

rohit1991
Super User

Hi @x_mark_x 

 

The problem occurs because the report is still connected to the original imported dataset from Desktop, so even though the Semantic Model in the Gold workspace refreshes daily, the visuals do not update. That’s why creating a new report directly on the Semantic Model shows the latest data, while the original report remains stale. The fix is to rebind the report so it points to the Semantic Model in the service:

  1. In Power BI Service, go to Get data > Power BI datasets.

  2. Select your Semantic Model in the Gold workspace.

  3. Build or republish the report using this Semantic Model.

Once the report is connected this way, it will always reflect the refreshed data from the Semantic Model without requiring manual refresh.
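If republishing is not convenient, another possible option is to repoint the already-deployed report at the service-hosted Semantic Model using the report rebind REST API. A minimal sketch (all IDs and the token are placeholders, acquired as in the earlier example):

```python
# Sketch: rebind an existing report to a different semantic model via the Power BI REST API.
# All IDs and the token are placeholders.
import requests

WORKSPACE_ID = "<prod-workspace-guid>"
REPORT_ID = "<report-guid>"
TARGET_DATASET_ID = "<semantic-model-guid>"  # the model the report should point at
TOKEN = "<bearer-token>"

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/reports/{REPORT_ID}/Rebind",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"datasetId": TARGET_DATASET_ID},
)
resp.raise_for_status()  # success means the report now reads from the target model
```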

Here’s a simple visual to illustrate:

[Image: ChatGPT Image Sep 2, 2025, 01_22_59 PM.png]

 


Did it work? ✔ Give a Kudo • Mark as Solution – help others too!
