Microsoft Fabric Community Conference 2025, March 31 - April 2, Las Vegas, Nevada. Use code FABINSIDER for a $400 discount.
Hey everybody,
so I created a report by querying data from our PostgreSQL server. It's quite a bit of data, so the file size is above the 250 MB limit and I can't upload it to the online service.
Is there any way to strip the data from the .pbix file so I can upload it and then refresh the data online only? Otherwise this would mean that Power BI can't be used for larger data sets.
Best
Seb
Have you tried using a direct connection instead of "import all"? 😉
That would probably make sense; a 250 MB file is not really scalable, and you will hit that limit in the long run anyway.
I would recommend:
a) using live connect with on-premises SQL Server
b) using live connect with Azure SQL DB
c) using an Analysis Services tabular model on top of SQL DB
Do you need to load every single column?
You can (and should) remove columns that you are not using to reduce the size of the data in the model.
This can be done in the Queries part (a.k.a. Power Query), and it can also be done from the data model (a.k.a. Power Pivot).
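For the Power Query route, trimming columns at the source looks roughly like this in the Advanced Editor. This is only a sketch: the server, database, schema, table, and column names below are made up for illustration.

```powerquery
// Hypothetical example: connect to PostgreSQL and keep only the columns the report uses.
let
    Source = PostgreSQL.Database("myserver", "mydb"),
    SalesTable = Source{[Schema = "public", Item = "sales"]}[Data],
    // Table.SelectColumns keeps the listed columns and drops everything else,
    // which shrinks what gets imported into the model.
    Trimmed = Table.SelectColumns(SalesTable,
        {"OrderId", "OrderDate", "CustomerId", "Amount"})
in
    Trimmed
```

Selecting the columns you keep (rather than removing the ones you don't) also makes the query resilient to new columns appearing in the source table.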
Regards
Sacha
Hey,
I removed all the columns I don't need, but there are more than 12 million rows and I need all of them. There is no possibility to scale down the data model any further.
There must be a different solution.
Have you just loaded it as one big table? i.e. is it currently a denormalised table containing duplication within columns?
Is there an opportunity for you to convert it to a star schema and thus remove some data duplication?
There is potential to drastically reduce the file size that way.
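To illustrate the idea: you can split a wide, denormalised table into a dimension plus a slim fact table in Power Query. The query name "BigTable" and all column names here are hypothetical.

```powerquery
// Sketch of extracting a customer dimension from a hypothetical wide table.
let
    Source = BigTable,
    // One row per customer; the repeated text values move out of the fact table
    // and are replaced there by the (cheaper) CustomerId key.
    DimCustomer = Table.Distinct(
        Table.SelectColumns(Source, {"CustomerId", "CustomerName", "Region"}))
in
    DimCustomer
```

A matching fact query would then keep only the key columns and the numeric measures (e.g. CustomerId, OrderDate, Amount), which is where most of the compression gain comes from.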
Sacha
Already had it in a star schema. No duplicates 😉
If you really need all the data and can't reduce/trim/dedupe/pre-agg then you are likely to need something like SSAS to store the data and use a live connection until/if they increase the limit.
SSAS might offer some additional benefits for your scenario too.
[Update: Have you looked at the data types you are using in the model? Can you use more efficient data types anywhere?]
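In Power Query, tightening the column types looks roughly like this. The query name "Sales" and the columns are again just placeholders for illustration.

```powerquery
// Hypothetical example: assign more compact types to reduce model size.
let
    Source = Sales,
    Typed = Table.TransformColumnTypes(Source, {
        {"OrderId", Int64.Type},      // whole number compresses better than text
        {"OrderDate", type date},     // date instead of datetime drops the time part
        {"Amount", Currency.Type}     // fixed decimal instead of floating point
    })
in
    Typed
```

Reducing the cardinality of a column (for example, dropping the time component from a datetime) tends to help the most, since the engine's compression works per distinct value.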
Sacha
Besides optimizing the .pbix file, I don't know any way to make a 250+ MB data model fit within the 250 MB limit.
The only alternative is putting the data in SSAS, which is usually not an option, or running it on-premises where SSRS is possible.
I hope they will raise this limit soon, or make it possible to pay for an upgrade.