Hi Experts,
I have a situation with my Power BI report:
My requirement:
I need to bring this large dataset into Power BI efficiently so that I can work with historical data without exceeding file size limits.
Question:
Thanks in advance for your help!
Regards,
SBC
Don't import the full 1 GB into Desktop. Use Incremental Refresh with RangeStart/RangeEnd parameters so Desktop only loads a slice of the data while the Service builds and manages the partitions (a minimal M sketch follows after this list).
Use Dataflows as a staging layer for heavy tables.
Keep recent data at detail level and aggregate older history.
Optimize the model: drop unused columns and use appropriate data types.
If the model is still too big, use a Composite Model (Import for recent data, DirectQuery for older data).
Best option: Incremental Refresh plus model optimization.
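For reference, here is a minimal Power Query M sketch of the RangeStart/RangeEnd filter pattern that incremental refresh relies on. The server, database, table, and OrderDateTime column below are hypothetical placeholders; the only fixed requirement is that RangeStart and RangeEnd exist as DateTime parameters in Desktop and filter a datetime column.

```
// RangeStart and RangeEnd must be defined as DateTime parameters in Power BI Desktop.
// At refresh time the Service substitutes each partition's boundaries for them.
let
    // Hypothetical SQL source; replace server, database, and table with your own.
    Source = Sql.Database("myserver.database.windows.net", "SalesDB"),
    Sales  = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
    // Filter the datetime column on the parameters.
    // Use >= on the lower bound and < on the upper bound so partitions never overlap.
    Filtered = Table.SelectRows(
        Sales,
        each [OrderDateTime] >= RangeStart and [OrderDateTime] < RangeEnd
    )
in
    Filtered
```

Once this filter is in place, you define the incremental refresh policy on the table in Desktop (for example, archive 5 years, refresh the last 10 days), and after publishing, the Service creates and refreshes the partitions for you.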
Hi @SBC,
Thank you @Shahid12523 for the response.
As we haven't heard back from you, we wanted to kindly follow up and check whether the solution provided by Shahid resolved your issue. Please let us know if you need any further assistance.
Thanks and regards,
Anjan Kumar Chippa
Hi @SBC,
Thank you for reaching out to Microsoft Fabric Community.
Thank you @audreygerred for the prompt response.
As we haven't heard back from you, we wanted to kindly follow up and check whether the suggested solution worked for you. Please let us know if you need any further assistance.
Thanks and regards,
Anjan Kumar Chippa
Hi @v-achippa ,
Thank you for the follow-up.
The previous response didn't fully meet my expectations. I'm reviewing alternative approaches and best practices within our Power BI Pro license constraints, and I'm also monitoring the community for any strong suggestions that align with the requirement.
Thanks,
SBC
Hello! Power BI typically handles large data well, and the compression in import mode is excellent. Which Fabric SKU do you have? That determines your size limit. If you are just using Pro, your model cannot exceed 1 GB. F64 allows 25 GB per model, F128 allows 50 GB, and F256 allows 100 GB.
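To build on the model optimization point raised earlier: trimming unused columns and tightening data types in Power Query is usually the quickest way to keep an imported model under the 1 GB Pro limit. This is only an illustrative sketch; the source, table, and column names below are hypothetical.

```
// Hypothetical example of pruning a wide fact table before load so the
// compressed import model stays small. Replace names with your own schema.
let
    Source  = Sql.Database("myserver.database.windows.net", "SalesDB"),
    Sales   = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
    // Keep only the columns the report actually uses; unused columns inflate the model.
    Trimmed = Table.SelectColumns(Sales, {"OrderDateTime", "CustomerKey", "ProductKey", "SalesAmount"}),
    // Explicit, narrow types compress better than text or auto-detected types.
    Typed   = Table.TransformColumnTypes(
        Trimmed,
        {
            {"OrderDateTime", type datetime},
            {"CustomerKey", Int64.Type},
            {"ProductKey", Int64.Type},
            {"SalesAmount", Currency.Type}
        }
    )
in
    Typed
```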
Proud to be a Super User!