Hi Team,
The data comes from SQL Server, and my two PBIX files are very large (2,591,219 KB and 1,962,302 KB). I would like to know how to reduce the file size. I have already removed unused columns. Also, scheduled refresh is not working in the Power BI service.
Thanks in advance, Fabric Community Team
Solved!
Another option is to use DirectQuery instead of Import. That way the data stays in the SQL database and is queried at report time, rather than being stored in the model in the cloud.
The data also contains blob photos, which I've converted to Base64, and that's only possible with Import mode, not DirectQuery. Please suggest how to reduce the data size.
Use composite mode, i.e. photos in Import mode and the rest of the data in DirectQuery mode.
Hi @NehaGoel_2203,
Take a look at this article:
Data reduction techniques for Import modeling - Power BI | Microsoft Learn
If you have already removed unused columns, the next step would be to look at the granularity of your data. Does it need to be that granular for the reporting that you are doing? Can you summarize the data to reduce the number of rows you have?
Proud to be a Super User!
@tayloramy thank you for the help. This is university data and every row is important, so summarizing won't help reduce the data size.
Hi @NehaGoel_2203,
If there is no way to reduce the dataset size, then the last option is to upgrade your capacity.
Proud to be a Super User!
how?
Hi @NehaGoel_2203,
Thank you for reaching out to Microsoft Fabric Community.
Thank you @cengizhanarslan, @Tutu_in_YYC and @tayloramy for the prompt response.
Upgrading capacity should be the last option.
Even if all rows are required, a large PBIX file is usually caused by high-cardinality columns, datetime precision, auto date/time tables, and oversized data types, not just row count. So before upgrading, please try DirectQuery or optimize the model properly (cardinality, datetime columns, data types, incremental refresh).
If the dataset is still large after proper modelling, then consider PPU or Fabric capacity.
Thanks and regards,
Anjan Kumar Chippa
Column removal helps, but row count matters much more.
Do this in SQL, not in Power BI:
Filter by date (e.g. last 2–3 years instead of full history)
Aggregate in SQL if you don’t need transaction-level detail
Avoid SELECT *
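As a sketch, the three points above might look like the following T-SQL source query. The table and column names (`dbo.FactEnrollment`, `EnrollmentDate`, `FeeAmount`, etc.) are hypothetical examples, not from the original model:

```sql
-- Hypothetical names for illustration only.
-- 1. Filter by date: keep only the last 3 years instead of full history.
-- 2. Aggregate in SQL: one row per student/course/month instead of per transaction.
-- 3. List explicit columns instead of SELECT *.
SELECT
    StudentKey,
    CourseKey,
    DATEFROMPARTS(YEAR(EnrollmentDate), MONTH(EnrollmentDate), 1) AS EnrollmentMonth,
    COUNT(*)       AS EnrollmentCount,
    SUM(FeeAmount) AS TotalFees
FROM dbo.FactEnrollment
WHERE EnrollmentDate >= DATEADD(YEAR, -3, CAST(GETDATE() AS date))
GROUP BY
    StudentKey,
    CourseKey,
    DATEFROMPARTS(YEAR(EnrollmentDate), MONTH(EnrollmentDate), 1);
```

Pushing the filter and aggregation into the source query means Power BI never imports the rows it doesn't need, which usually shrinks the model far more than removing columns alone.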
Certain columns explode memory usage even if they look harmless:
Replace text keys with integer surrogate keys
Split datetime into Date (keep) and Time (remove if not needed)
Move long text columns to a detail table or remove them entirely
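A sketch of the column-level fixes above in T-SQL, again with hypothetical names; the same ideas apply to any source query:

```sql
-- Hypothetical names for illustration only.
-- Replace the wide text key with an integer surrogate key via a lookup table,
-- keep only the date portion of the datetime (drop the time),
-- and leave the long Notes column out of the main import query.
SELECT
    s.StudentKey,                               -- integer surrogate instead of text StudentCode
    CAST(f.EventDateTime AS date) AS EventDate, -- date only; time removed to cut cardinality
    f.Amount
FROM dbo.FactEvents AS f
JOIN dbo.DimStudent AS s
    ON s.StudentCode = f.StudentCode;           -- f.Notes intentionally not selected
```

Casting the datetime to a date alone can cut a column's distinct-value count from millions to a few thousand, which is what the VertiPaq compression engine cares about.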
Auto Date/Time silently creates hidden date tables per column, increasing size.
Power BI Desktop → Options → Data Load
Uncheck Auto Date/Time
Then refresh and save again.
Often overlooked:
Unused tables still consume memory
Hidden columns still consume memory
Complex visuals don’t affect PBIX size much, but calculated columns do
Replace calculated columns with measures
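For example, a calculated column like the one below is materialized for every row of the table, while the equivalent measure is computed at query time and adds almost nothing to file size (table, column, and the 1.18 tax rate are hypothetical):

```dax
-- Calculated column: stored for every row of FactEnrollment (costs memory)
Fee incl. Tax = FactEnrollment[FeeAmount] * 1.18

-- Measure: evaluated on demand, no storage cost
Total Fee incl. Tax = SUMX ( FactEnrollment, FactEnrollment[FeeAmount] * 1.18 )
```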