Hi,
I have a Power BI file which is currently 1.8 GB, and I'm trying to bring it down to 1-1.5 GB. The data is huge, and it won't go below 1 GB unless we split the reports, which is difficult.
I have one entity with 2 date columns; one of them takes up 600 MB and the other 390 MB, so between them they use almost a GB.
The columns are full datetime, and the data is read from an Azure Data Lake file.
The file itself is 2.5 GB, and the datatype is datetime with millisecond precision in the file, but in Power BI it gets trimmed to second level.
Is there any way of optimising the storage of these datetime columns?
I do not need a hierarchy for these 2 columns, but I don't see a way of disabling the hierarchy for only selected columns.
I tried splitting the date and time parts, but that did not help (does it need to be done before loading into the PBIX for the optimization to work?).
I also have quite a lot of calculated columns which I'm trying to move to data lake storage, in case that helps a little with compression, as I read somewhere that calculated columns have a very poor compression rate.
Any other suggestions on what can be optimised?
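For context on why splitting date and time should be done before load: the engine dictionary-encodes each column, so column size scales with the number of distinct values, and a second-level datetime has enormous cardinality. A minimal Python sketch (illustrative only, not Power BI code; the row counts are hypothetical) shows how splitting one datetime into a date column and a time column shrinks each dictionary:

```python
from datetime import datetime, timedelta

# Simulate 100,000 timestamps at 1-second resolution (~28 hours of data).
start = datetime(2024, 1, 1)
stamps = [start + timedelta(seconds=i) for i in range(100_000)]

# Combined datetime: every row is a distinct value, so the dictionary
# is as large as the table.
combined_cardinality = len(set(stamps))

# Split columns: the date dictionary has a handful of entries, and the
# time-of-day dictionary can never exceed 86,400 entries.
date_cardinality = len({s.date() for s in stamps})
time_cardinality = len({s.time() for s in stamps})

print(combined_cardinality)  # 100000
print(date_cardinality)      # 2
print(time_cardinality)      # 86400
```

The same reasoning is why the split has to happen in the source query (Power Query / the data lake file) rather than as a DAX calculated column: a calculated column is computed after compression and stores its own values, so the original high-cardinality column still occupies the model.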
Hi @dilkushpatel,
Based on my research, you could refer to the link below about Power BI performance:
https://docs.microsoft.com/en-us/power-bi/power-bi-reports-performance
And you could also refer to GilbertQ's reply in the issue below:
Regards,
Daniel He