Hello,
We have a large and complex semantic model that powers a report and a few dashboards.
We recently looked into purchasing Fabric as a replacement for our current shared capacity, which uses Power BI Pro licenses only.
After testing with Fabric, we found that we are unable to refresh the semantic model at all.
Here is the semantic model's summary from VertiPaq Analyzer:

| VertiPaq Analyzer Summary | Value |
| --- | --- |
| Total Size (bytes) | 886,483,293 |
| Tables | 271 |
| Partitions | 271 |
| Segments | 271 |
| Columns | 8,765 |
| Measures | 320 |
| Calc tables | 6 |
| Calc columns | 97 |
| Roles | 383 |
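As a sanity check on those numbers, here is a quick sketch converting the reported size and estimating refresh-time memory. The 2x refresh-peak multiplier is an assumption (a common rule of thumb, since the old copy of the model stays in memory while the new one is built), not a documented limit:

```python
# Sanity check on the VertiPaq "Total Size" above. The 2x refresh-peak
# multiplier is an assumption, not a documented figure.
TOTAL_SIZE_BYTES = 886_483_293  # "Total Size" from VertiPaq Analyzer

size_mib = TOTAL_SIZE_BYTES / 1024**2
size_gib = TOTAL_SIZE_BYTES / 1024**3

# Assumed: a full refresh holds the old copy of the model in memory
# while the new one is built, so peak usage is roughly double at-rest size.
assumed_peak_gib = size_gib * 2

print(f"At rest:      {size_mib:,.0f} MiB ({size_gib:.2f} GiB)")
print(f"Refresh peak: ~{assumed_peak_gib:.2f} GiB (assumed 2x)")
```

So the model sits at roughly 845 MiB at rest, under 1 GiB, which is what makes the refresh failures surprising.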
I am curious about anyone else's experience with Fabric, and whether anyone out there has experienced failed refreshes for multiple days in a shared capacity.
After testing on an F64 SKU, I was able to refresh and users were able to view the report; however, nothing happened on the F32 or lower SKUs.
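For anyone trying to diagnose the same thing, refresh outcomes can be pulled programmatically via the Power BI REST API's Get Refresh History endpoint. A minimal Python sketch; the group ID, dataset ID, and token are placeholders, and the fetch itself is only defined, not run:

```python
from collections import Counter

def summarize_refreshes(payload: dict) -> Counter:
    """Count outcomes ('Completed', 'Failed', ...) in a refresh-history
    response body from GET .../datasets/{id}/refreshes."""
    return Counter(r.get("status", "Unknown") for r in payload.get("value", []))

def fetch_refresh_history(group_id: str, dataset_id: str, token: str) -> dict:
    """Call the Power BI 'Get Refresh History In Group' REST endpoint.
    Requires a valid Azure AD bearer token; not executed in this sketch."""
    import requests
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
           f"/datasets/{dataset_id}/refreshes?$top=50")
    resp = requests.get(url, headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()
    return resp.json()
```

The `serviceExceptionJson` field on each failed entry is where the service puts whatever error detail it is willing to share.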
Regards
@aj1973 has provided as much information as Microsoft has made available, so thanks for your responses.
I have not marked this as a solution because Microsoft has only stated that models over 1 GB are unable to refresh on F32 or lower. Nowhere is it stated that a model of 880 MB is prone to failure.
The 880 MB model was refreshing fine without Fabric; it currently refreshes 8 times a day in a shared capacity.
After removing as many unnecessary elements as I could from my dataset, I reduced it to 525 MB, and it made no difference on the F32 SKU.
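For others doing the same trimming exercise: ranking columns by size from a VertiPaq Analyzer export makes it easier to decide what to cut first. A rough pandas sketch with made-up rows; real exports carry more fields and the header names may differ by version:

```python
import pandas as pd

# Hypothetical rows as exported from VertiPaq Analyzer's column view;
# a real export has more fields and possibly different header names.
cols = pd.DataFrame({
    "Table":  ["Sales", "Sales", "Dates", "Customer"],
    "Column": ["OrderKey", "Amount", "Date", "Email"],
    "TotalSizeBytes": [310_000_000, 120_000_000, 4_000_000, 95_000_000],
})

# Largest columns first -- the usual first targets when trimming a model
# (high-cardinality keys, unused text columns, audit fields).
top = cols.sort_values("TotalSizeBytes", ascending=False).head(3)
print(top[["Table", "Column", "TotalSizeBytes"]].to_string(index=False))
```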
Microsoft can have an outage and not provide anyone with any further information, which is unacceptable given the reliance companies place on this product. I am currently seeking alternatives, as the lack of information is poor and 5,000-10,000 USD a month isn't "an easy fix".
Providing more transparency on the limits of the Power BI service in a shared capacity would be much more helpful than marking posts as solved when no real resolution was achieved.
Hi @srajapinn,
The size of the semantic model is over 1 GB, so it needs a Premium capacity to run the refresh. By default, F64 or higher corresponds to Premium-capacity workspaces, which is why the refresh worked on F64 but not on F32.
Regards
Amine Jerbi
If I answered your question, please mark this thread as accepted
and you can follow me on
My Website, LinkedIn and Facebook
Hi @aj1973,
We have had this model refreshing in the shared capacity for almost a year with only a few days of downtime.
I am able to refresh it in the Premium capacity. The model size according to VertiPaq is 886,483,293 bytes, or roughly 880 MB. Would my model need to be 500 MB in order to refresh on F32? Because when the F64 is active, the refresh isn't taking up anywhere near half of my CUs.
Many thanks
Sajan
In the service, the size of the semantic model can be reported slightly higher than in VertiPaq Analyzer.
To answer your question about 500 MB: yes, optimizing your semantic model to lower its size could be a good idea, since the refresh was running without any problems before.
Regards
Amine Jerbi
If I answered your question, please mark this thread as accepted
and you can follow me on
My Website, LinkedIn and Facebook