Hello everybody,
I have an issue with a specific report. It's only 280 MB in Desktop, but 800 MB in the Service, and it keeps growing there. Since I only have a Pro license, once it reaches 1 GB it will be a problem. Can you tell me why this happens?
Desktop: (screenshot)
Service: (screenshot)
Hi @Jhorman_Gomez ,
Thank you for reaching out to the Microsoft Fabric Community forum.
Please try the options below to fix the issue:
1. Remove unnecessary columns and tables, replace calculated columns with DAX measures, and use numeric types instead of strings.
2. Use DAX Studio and VertiPaq Analyzer to analyze memory size per column. Identify the largest tables and columns and optimize them.
3. Enable incremental refresh. It keeps the data size under control by loading only new or changed data, which is especially useful for large fact tables with historical data.
4. Use query folding wherever possible. Power Query steps that run locally instead of folding back to the source can increase the dataset size.
5. After cleaning up the model, delete the semantic model from the Service and republish the optimized PBIX. This avoids any historical bloat in the semantic model.
6. Consider DirectQuery or Live Connection mode instead of Import mode when connecting to data sources.
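As a rough illustration of points 1 and 2 (column data types can dominate model size), here is a small pandas sketch. It's only an analogy for what VertiPaq Analyzer reports per column — Power BI's actual columnar compression works differently — and all names in it are made up for the demo.

```python
import pandas as pd

# Hypothetical demo data: the same key stored as text vs. as an integer.
n = 100_000
df = pd.DataFrame({
    "id_as_text": [str(i) for i in range(n)],  # string column
    "id_as_int": range(n),                     # numeric (int64) column
})

# Per-column memory in bytes; deep=True also counts the string payloads.
sizes = df.memory_usage(deep=True, index=False)
print(sizes.sort_values(ascending=False))

# The text column is several times larger than the integer one, which is
# why replacing text keys with numeric types shrinks a model.
assert sizes["id_as_text"] > sizes["id_as_int"]
```

Ranking columns by size like this (the VertiPaq Analyzer workflow) tells you where optimization effort actually pays off.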
Please refer to the community threads and official Microsoft documentation below.
Troubleshoot report performance in Power BI - Power BI | Microsoft Learn
Solved: real size of data model - Microsoft Fabric Community
Solved: Power BI - Microsoft Fabric Community
Solved: How do i reduce size of the PBIX file to improve t... - Microsoft Fabric Community
Solved: How to reduce size of my pbix file (caused by appe... - Microsoft Fabric Community
Solved: How to make PBI file size reduce by 50% ? - Microsoft Fabric Community
Semantic model modes in the Power BI service - Power BI | Microsoft Learn
Query folding guidance in Power BI Desktop - Power BI | Microsoft Learn
Incremental refresh for semantic models in Power BI - Power BI | Microsoft Learn
If this information is helpful, please "Accept it as a solution" and give a kudos to help other community members resolve similar issues more efficiently.
Thank you.
Hi @Jhorman_Gomez ,
We haven't heard back from you on the last response and wanted to check whether you have a resolution yet. If so, please click "Accept Answer"; and if you have any further questions, do let us know.
Thank you.
I hadn't tried deleting the report on service and uploading it again. That worked, thanks.
Hi @Jhorman_Gomez ,
Are you using incremental refresh? That can store more data in the Service than you would see in Desktop. Or is the Service refresh failing often? That might keep some temporary data around.
You might want to try VertiPaq Analyzer to inspect size and usage by table and column and see whether a specific item is "out of control".
Proud to be a Datanaut!
Private message me for consulting or training needs.
Hello, sir. I'm not using incremental refresh. The report does fail on a daily basis because we're forced to query an OLTP database, which cancels requests longer than 30 seconds to prioritize row-version updates (a WAL-related setting in the AWS Postgres configuration). I didn't know any temporary data was stored in the Service. How can I locate this data in order to clean it up?
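For what it's worth, the watermark pattern behind incremental refresh (suggested earlier in the thread) can be sketched in plain Python: instead of re-reading the whole table, each refresh fetches only rows newer than the last successful load, which also keeps each query short enough to survive a 30-second OLTP statement timeout. The table, the `fetch_rows_since` helper, and the in-memory "database" below are all hypothetical stand-ins, not Power BI's actual implementation.

```python
from datetime import date

# Hypothetical source table: (order_id, order_date) rows in an OLTP database.
SOURCE = [
    (1, date(2025, 1, 1)),
    (2, date(2025, 1, 2)),
    (3, date(2025, 1, 3)),
]

def fetch_rows_since(watermark: date) -> list[tuple[int, date]]:
    """Stand-in for a folded query such as:
       SELECT order_id, order_date FROM orders WHERE order_date > %s
    Only rows past the watermark are scanned, so the query stays short."""
    return [row for row in SOURCE if row[1] > watermark]

# First refresh: nothing loaded yet, so pull everything.
loaded = fetch_rows_since(date.min)
watermark = max(row[1] for row in loaded)

# A new row arrives in the source...
SOURCE.append((4, date(2025, 1, 4)))

# ...and the next refresh fetches only that delta.
delta = fetch_rows_since(watermark)
loaded.extend(delta)
```

In Power BI this pattern is configured through incremental refresh policies rather than hand-written code, but the effect is the same: small delta queries instead of one long full scan.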