Unfortunately I get some unexpected results with the new semantic model refresh activity. I used it in a simple scheduled pipeline that refreshes dataflows and, upon success, refreshes the semantic model.
The pipeline completes without errors, the dataflows refresh, and I have verified that the data there is refreshed. The semantic model _appears_ to have been refreshed as well: the refresh time is updated and there are no warnings in the workspace. However, when I open the report running on top of the semantic model, the data has actually not been updated. When I run a manual refresh in the workspace, it does update the data.
When looking at the refresh history for the semantic model, I see that the 'on demand' and 'scheduled' refreshes took roughly 2 minutes, while the 'Via Enhanced Api' refresh triggered by the pipeline activity took only 6 seconds, despite not returning an error.
In the JSON output of the pipeline it also shows each individual table as 'successful'.
I tried adding a 5-minute Wait activity in between, to account for a possible delay in the dataflow actually being populated, but it made no difference.
Could it be that I need to be a tenant admin? The documentation says 'tenant account', but to me that just seems to mean any xx@tenant.xx with a Power BI license. I'm also the owner of the semantic model, by the way.
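For anyone who wants to make the same comparison, below is a minimal sketch of pulling that refresh history through the Power BI REST API ('Get Refresh History In Group'); the token, workspace ID, and dataset ID are placeholders, not values from my setup:

```python
# Sketch: list recent refreshes for a semantic model and compare their
# type, status, and duration (OnDemand / Scheduled vs. ViaEnhancedApi).
# Assumes you already have an Azure AD access token with read access to the
# dataset; WORKSPACE_ID and DATASET_ID are placeholders.
import requests
from datetime import datetime

ACCESS_TOKEN = "<aad-access-token>"   # placeholder
WORKSPACE_ID = "<workspace-guid>"     # placeholder
DATASET_ID = "<semantic-model-guid>"  # placeholder

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/datasets/{DATASET_ID}/refreshes?$top=10"
)
resp = requests.get(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
resp.raise_for_status()

for r in resp.json()["value"]:
    end_raw = r.get("endTime")  # may be missing while a refresh is still running
    if end_raw:
        start = datetime.fromisoformat(r["startTime"].replace("Z", "+00:00"))
        end = datetime.fromisoformat(end_raw.replace("Z", "+00:00"))
        duration = f"{(end - start).total_seconds():.0f}s"
    else:
        duration = "in progress"
    print(f'{r["refreshType"]:>16} {r.get("status", "?"):>10} {duration:>12}')
```

If the 'Via Enhanced Api' entries report as completed after only a few seconds while the scheduled ones take minutes, that matches the behaviour described above.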
A fix was implemented last Friday and I can confirm that it now works as expected. Thank you.
From a LinkedIn post:
"The fix is going out this week to all tenants. Please keep an eye out for next week that you’re seeing the appropriate refresh state"
Same issue here. I'm the admin of the capacity. I tested on 3 different semantic models (import mode), always with the same issue.
Same here. Has anyone opened a ticket?
Hi, I did. They're looking into it.
I did too; they made me redo most of the steps, unsuccessfully. They transferred the request to another team, because the person helping me wasn't able to solve the problem.
Any update on this? Not working for me either.
Not yet.
I have an ongoing support case and sent more information to the team.
Same problem here.
I created a simple Power BI report for a quick test, with this as the only data: Source = DateTime.LocalNow().
When the semantic model is refreshed manually or on a schedule, you can see the datetime label update in the report, but when it is refreshed through the pipeline activity it is marked as refreshed while the data is not updated.
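To rule out the pipeline activity itself, here is a minimal sketch of calling the enhanced refresh endpoint directly, which (judging by the 'Via Enhanced Api' entries in the refresh history) appears to be the same path the activity uses; the token and IDs are placeholders:

```python
# Sketch: trigger an enhanced refresh of the test model directly via the
# Power BI REST API, to see whether the data updates outside the pipeline.
# ACCESS_TOKEN, WORKSPACE_ID, and DATASET_ID are placeholders.
import requests

ACCESS_TOKEN = "<aad-access-token>"   # placeholder
WORKSPACE_ID = "<workspace-guid>"     # placeholder
DATASET_ID = "<semantic-model-guid>"  # placeholder

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/datasets/{DATASET_ID}/refreshes"
)
# Including any enhanced-refresh parameter (e.g. commitMode) makes this an
# enhanced refresh rather than a plain "refresh now" call.
body = {"type": "Full", "commitMode": "transactional"}
resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=body,
)
resp.raise_for_status()
# A 202 means the refresh was accepted; it should then appear in the refresh
# history with refreshType "ViaEnhancedApi".
print(resp.status_code)
```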
I'm sure somebody is already working on this so thank you in advance.
Marco
Today my test semantic model is correctly refreshed through the pipeline semantic model refresh activity. So are other semantic models with data from various sources. It seems this issue has been solved, but I'll keep monitoring to see if it breaks again before using it for real.
Many thanks to anyone working on it.
It's working on our side too.
Thanks everyone!
It seems to work for me too.
Thanks!
@MG86 wrote:
When looking at the refresh history for the semantic model, I see that the 'on demand' and 'scheduled' refreshes took roughly 2 minutes, while the 'Via Enhanced Api' refresh triggered by the pipeline activity took only 6 seconds, despite not returning an error.
I can confirm the exact same behaviour unfortunately!
Hello @MG86 ,
Thanks for using Fabric Community.
At this time, we are reaching out to the internal team to get some help on this.
We will update you once we hear back from them.
Hi @Anonymous ,
Any update on this? I saw the post on the Fabric Blog yesterday about this feature; however, it seems that quite a few of us are not able to use it.
Thanks in advance.
Kevin
Hi @MG86 ,
CC: @marco_parravano, @chris__1, @eivindhaugen
Apologies for the issue you are facing. The best course of action is to open a support ticket and have our support team take a closer look at it.
Please reach out to our support team so they can do a more thorough investigation into why this is happening: Link
After creating a support ticket, please provide the ticket number, as it would help us to track it for more information.
Hope this helps. Please let us know if you have any other queries.
Hi @MG86 ,
We haven't heard from you since the last response and were just checking back to see if you got a chance to create a support ticket.
After creating a support ticket, please provide the ticket number, as it would help us to track it for more information.
Hi, I hadn't gotten around to creating a support ticket until now. Here is the ticket number: 2405020050003928
Hi, I have the same issue, and I've submitted a support ticket: 2404230040008403
I got an error, but not sure if it is the same issue:
First I was able to run it, but then I got an error after setting it up as a scheduled refresh, with the following error message:
semantic model refresh {"error":{"code":"InvalidRequest","message":"data ID belongs to a shared capacity."}}
The semantic model was imported from the Lakehouse, but published to a workspace which is not attached to a Fabric capacity. Is this a limitation? (I did not find any information about it.)
I published the model to a Fabric capacity workspace instead and it seems to work.
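In case it helps anyone who hits the same error, here is a rough sketch of listing which workspaces are actually attached to a dedicated (Fabric/Premium) capacity before pointing the refresh activity at a model, using the standard 'Get Groups' REST call; the token is a placeholder:

```python
# Sketch: list workspaces and whether they sit on a dedicated capacity,
# to spot models published to shared-capacity workspaces.
# ACCESS_TOKEN is a placeholder for an Azure AD token with workspace read access.
import requests

ACCESS_TOKEN = "<aad-access-token>"  # placeholder

resp = requests.get(
    "https://api.powerbi.com/v1.0/myorg/groups",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()

for ws in resp.json()["value"]:
    on_capacity = ws.get("isOnDedicatedCapacity", False)
    print(f'{ws["name"]:<40} dedicated capacity: {on_capacity} '
          f'(capacityId: {ws.get("capacityId", "-")})')
```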