Hi all,
I have this PBIX (around 750 MB) that I want to publish to a Pro workspace (not Premium). After a while I get the notification that the publish is completed. However, the dataset does not load in the Power BI service: it keeps loading indefinitely until it eventually hits an error.
I used to publish this report without problems. Something seems to have changed, but I cannot figure out what.
My assumption was that I should be able to upload up to 1 GB to the service, but apparently it no longer loads into the online environment.
Any ideas what causes the data not to load and how I can fix this?
Kind regards,
Angelo
This needs more details. Do you have a scheduled refresh set up? How many calculated columns and/or calculated tables are there? Have you evaluated the semantic model metrics with DAX Studio?
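For what it's worth, DAX Studio's View Metrics (on the Advanced ribbon) already lists size per table and column, but you can also probe a suspect column yourself. A minimal sketch, using placeholder names Sales and Sales[TransactionID] rather than anything from your actual model:

EVALUATE
ROW (
    "Fact rows", COUNTROWS ( Sales ),  -- total rows in the fact table
    "ID cardinality", DISTINCTCOUNT ( Sales[TransactionID] )  -- distinct values drive dictionary size
)

High-cardinality columns (IDs, timestamps, free text) tend to dominate the in-memory size, so these counts show where trimming pays off.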
Thanks for your reply:
This PBIX serves as the data model for multiple reports published in Power BI. I load a lot of data, maintain the semantic model here, and then reuse this dataset in other reports.
I already evaluated the semantic model in DAX Studio; I'm just not sure what conclusions to draw from it. The only thing I can think of is reducing the amount of data in the model, but I really do need it. I've attached some of the results.
The error occurs when I publish the PBIX to the service, but also when I refresh the dataset manually or via scheduled refresh in the service.
So when publishing from Power BI Desktop, it confirms (after a long time) that the report has been uploaded, but the dataset then keeps loading indefinitely until an error pops up.
It says it right there: total size in memory is 1.51 GB. A Pro workspace caps semantic models at 1 GB in memory, so that dataset is too big for your capacity.
Cardinality for dimension tables should not exceed roughly 100K rows; your Divisions table is likely not really a dimension.
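To make that concrete, you can compare the table's row count against the distinct count of its key. A sketch, where DivisionKey is a guessed column name (substitute whatever joins Divisions to your fact table):

EVALUATE
ROW (
    "Divisions rows", COUNTROWS ( Divisions ),  -- well past 100K suggests fact-like granularity
    "Distinct keys", DISTINCTCOUNT ( Divisions[DivisionKey] )  -- should match the row count in a clean dimension
)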
Alright, I thought I had to look at the PBIX file size, but actually it's the total size in memory that counts.
Are there other ways to decrease this memory footprint?
You can look for data duplication and normalize some more (create more dimensions). But that will only get you so far, and it comes with computational baggage.
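As a sketch of how you might hunt for that duplication (placeholder names again; Sales[DivisionName] stands in for any repeated descriptive column in a large table):

EVALUATE
ROW (
    "Fact rows", COUNTROWS ( Sales ),
    "Distinct names", DISTINCTCOUNT ( Sales[DivisionName] ),
    "Duplication ratio", DIVIDE ( DISTINCTCOUNT ( Sales[DivisionName] ), COUNTROWS ( Sales ) )
)

If the ratio is tiny, moving that column into its own dimension with a small integer key shrinks the dictionary; the computational baggage is the extra relationship the engine then has to traverse at query time.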
Thank you so much. I'll take another look at my data model then.
Cheers!