Hey everybody,
so I created a report by querying data from our PostgreSQL server. It's quite a bit of data, so the file size is above the 250 MB limit and I can't upload it to the online dashboard.
Is there any way to strip the data from the pbix file so it can be uploaded, and then have the data refresh only online? Otherwise this would mean that Power BI can't be used for larger data sets.
Best
Seb
Have you tried using a direct/live connection instead of the "import all" approach? 😉
Perhaps that makes sense. A 250 MB file is not really scalable; you will hit that limit in the long run anyway.
I would recommend:
a) using live connect with on-premises SQL Server
b) using live connect with Azure SQL DB
c) using an Analysis Services tabular model on top of a SQL DB
Do you need to load every single column?
You can (should) remove columns that you are not using to reduce the size of the data in the model.
This can be done through the Queries part (a.k.a. Power Query) and it can be done from the data model too (a.k.a. Power Pivot).
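As a rough sketch, trimming to only the needed columns in Power Query M might look like this (the server, database, table, and column names are all placeholders for your own):

```m
let
    // Connect to PostgreSQL (server/database names are placeholders)
    Source = PostgreSQL.Database("your-server", "your-db"),
    Sales = Source{[Schema = "public", Item = "sales"]}[Data],
    // Keep only the columns the report actually uses;
    // everything else is dropped before it reaches the data model
    Trimmed = Table.SelectColumns(Sales,
        {"order_id", "order_date", "customer_id", "amount"})
in
    Trimmed
```

Table.SelectColumns keeps a whitelist of columns; Table.RemoveColumns does the opposite and drops a named list, which can be handier when you only want to exclude a few.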
Regards
Sacha
Hey,
I removed all the columns I don't need, but there are more than 12 million rows, and I need all of them. There's no possibility to scale the data model down any further.
There must be a different solution.
Have you just loaded it as one big table? i.e. is it currently a denormalised table containing duplication within columns?
Is there an opportunity for you to convert it to a star schema and thus remove some data duplication?
There is potential to drastically reduce the file size that way.
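One way to sketch that conversion in Power Query M: build a dimension query by de-duplicating the repeated attributes, then drop those columns from the fact table so only the key remains (the query and column names here are hypothetical):

```m
// Dimension query: one row per customer instead of one per sale
let
    Source = Sales,  // the existing denormalised query (placeholder name)
    DimCustomer = Table.Distinct(
        Table.SelectColumns(Source, {"customer_id", "customer_name", "region"}))
in
    DimCustomer

// In the fact query, keep only the key and drop the repeated text columns:
//   FactSales = Table.RemoveColumns(Sales, {"customer_name", "region"})
```

Repeated text columns are what usually bloats an imported model, so moving them into small dimension tables related by an integer key can shrink the file considerably.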
Sacha
Already had it in a star schema. No duplicates 😉
If you really need all the data and can't reduce/trim/dedupe/pre-aggregate it, then you are likely to need something like SSAS to store the data and use a live connection, at least until/unless they increase the limit.
SSAS might offer some additional benefits for your scenario too.
[Update: Have you looked at the data types you are using in the model? Can you use more efficient data types anywhere?]
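On the data-type point, a hedged Power Query M example (query and column names are placeholders): whole numbers and fixed decimal (Currency.Type) generally compress better than general decimal, and a date column is cheaper than a datetime one.

```m
let
    Source = Sales,  // existing query (placeholder name)
    Typed = Table.TransformColumnTypes(Source, {
        {"order_date", type date},     // date instead of datetime
        {"quantity", Int64.Type},      // whole number instead of decimal
        {"amount", Currency.Type}      // fixed decimal compresses better
    })
in
    Typed
```

Because the import engine compresses column by column, lowering the cardinality and precision of a few wide columns can have an outsized effect on the pbix size.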
Sacha
Besides optimizing the pbix file, I don't know any way to make a 250+ MB data model fit within the 250 MB limit.
The only alternative is putting the data in SSAS, which most of the time isn't an option, or running it on-premises where SSRS is possible.
I hope they will raise this limit soon, or make it possible to pay for an upgrade.