Hey Guys!!!
We have a conditional column that is used as a filter in many reports. We observed that this conditional column takes up a huge amount of RAM, around 50 GB (yes, you read that right, 50 gigs :/). Once we exclude this conditional column, the scheduled refresh works without issues. Our capacity is a P1, which can only handle 24 gigs, and it has become too difficult to work without this column. If we could optimize this column so it takes less RAM, that would resolve our memory issue.
*** We can't make it a measure since it's used as a filter, and M query is also not possible due to complex logic.
Any other suggestions would be highly appreciated!!!
*******************************
Not sure what you mean here. Power Query uses as much memory as it needs for the data. If you have 24 GB of data after compression, then you have 24 GB of data, and no amount of formula changes will fix that. When the service says you are out of memory, that is what it means.

If it is 24 GB, that is millions and millions of rows. By default the engine chunks the data into 1M-row segments, but it should use 8M-record segments if you have "enable large dataset" turned on in the model settings on the service.

But once the data is in the model, it still sits in a temporary place. Only after all tables have refreshed successfully does Power BI swap out the databases. M code has nothing to do with this final step.
DAX is for Analysis. Power Query is for Data Modeling
Proud to be a Super User!
MCSA: BI Reporting

It depends on how much data you have. If you are at 24 GB, you are going to start having problems in the 25 GB P1 capacity anyway, as you need more memory to refresh.
If this is a large table, then doing it in M or further upstream will help, but only because the engine will take this column into account when it compresses the table. Calculated columns are compressed, but they are not considered when the compression algorithm runs, so they get compressed according to however the rest of the table was already sorted.
So converting from a calculated column to M is preferable, but it will still take up space. Even from M, when the column is taken into account, it might be highly compressed, but it can then cause other columns to be sorted differently, and those could grow. You'd have to test.
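To make the conversion concrete, a conditional column that exists today as a DAX calculated column can usually be rebuilt as a Power Query custom column so it arrives during refresh. A minimal sketch, with all step, column, and band names invented for illustration (none of them come from the original post):

```m
// Hypothetical sketch: a conditional column done in M instead of DAX.
// "Source", [Amount], "OrderSizeBand", and the band labels are
// illustrative names only.
AddedBand = Table.AddColumn(
    Source,
    "OrderSizeBand",
    each if [Amount] >= 10000 then "Large"
         else if [Amount] >= 1000 then "Medium"
         else "Small",
    type text
)
```

Because the column now exists before load, VertiPaq can consider it when choosing the table's sort order for compression, which is the benefit of moving it out of a calculated column.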
But there is no magic way to get what you want with no RAM consumption.
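One pattern worth testing when the conditional logic produces only a handful of distinct values: push the logic into a small lookup table and merge it in, so the large fact table carries just a compact key instead of evaluating the condition row by row. A hedged sketch, with every table and column name invented for illustration:

```m
// Hypothetical sketch: replace row-by-row conditional logic with a
// merge against a small lookup table. FactTable, StatusLookup,
// "StatusCode", and "FilterCategory" are illustrative names only.
Merged = Table.NestedJoin(
    FactTable, {"StatusCode"},
    StatusLookup, {"StatusCode"},
    "Lookup", JoinKind.LeftOuter
),
Expanded = Table.ExpandTableColumn(Merged, "Lookup", {"FilterCategory"})
```

Whether this actually reduces memory depends on the cardinality of the resulting column, so measure the model before and after (for example with VertiPaq Analyzer) rather than assuming.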
DAX is for Analysis. Power Query is for Data Modeling
Proud to be a Super User!
MCSA: BI Reporting