I got this messy legacy report that is blowing up the memory because of this calculated column
and the table it is being calculated on is just a table (NPD) with 2 dates
the history portion (with the cutoff date) works fine
However, the use of a SUMMARIZE inside a FILTER inside a CALCULATE on 2 tables that are 10M+ rows is blowing everything up
I can't even understand what is happening in the DAX in order to improve it
The concept is pretty simple: for each combination of dates in NPD it goes into the table 'Aquisições' to do this calculation: SUM('Aquisições'[VALOR DE VENCIMENTO]) - SUM('Aquisições'[Liquidado Payments]). But somehow they want to dynamically filter the table 'Aquisições' before doing the calculation, so it sums up only the values that fit the criteria for that specific row of dates.
However, the 'Aquisições' table has 10M+ rows and Baixas has 3M+, so this makeshift boolean-mask emulation of a for loop is taking up too much memory.
Any clue on how to break it down or calculate it another way? I've been trying for the past 3 days and managed to cut down memory usage by 3 GB by using different functions and getting rid of a bunch of useless filters, but I cannot get past this current form.
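From what I can tell, stripped of the extra filters the column boils down to roughly this shape (heavily simplified; the NPD and 'Aquisições' date column names below are just placeholders, not the real ones):

NPD Outstanding =
VAR StartDate = NPD[Data Inicio]                         -- placeholder column name
VAR EndDate   = NPD[Data Fim]                            -- placeholder column name
RETURN
    CALCULATE (
        SUM ( 'Aquisições'[VALOR DE VENCIMENTO] )
            - SUM ( 'Aquisições'[Liquidado Payments] ),
        FILTER (
            SUMMARIZE (
                'Aquisições',
                'Aquisições'[Data Vencimento]            -- placeholder column name
            ),
            'Aquisições'[Data Vencimento] > StartDate
                && 'Aquisições'[Data Vencimento] <= EndDate
        )
    )

So for every row of NPD it rebuilds and filters a table over the 10M+ row 'Aquisições', which is where the memory goes.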
Hi, @Tlessa
Where possible, try pre-aggregating data in the Aquisições and Baixas_Payments tables. This can mean creating intermediate tables or measures that contain aggregate values that can be referenced by calculated columns. This reduces the amount of data that needs to be processed in the calculated column itself. If there are parts of your calculations that don't change often, such as historical data, consider materializing those calculations as static tables with Power Query. This way, you won't have to recalculate the values every time.
You can check the following link:
DAX query view in Power BI Desktop - Power BI | Microsoft Learn
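As a rough sketch of the pre-aggregation idea (the date column names and the exact filter conditions below are assumptions, please adjust them to your model), you could first build a small intermediate aggregate table and then have the calculated column on NPD scan only that table:

-- Intermediate aggregate: one row per date instead of 10M+ detail rows
Aquisicoes Agg =
SUMMARIZE (
    'Aquisições',
    'Aquisições'[Data Vencimento],                       -- assumed date column
    "Vencimento Total", SUM ( 'Aquisições'[VALOR DE VENCIMENTO] ),
    "Liquidado Total",  SUM ( 'Aquisições'[Liquidado Payments] )
)

-- Calculated column on NPD that reads from the small aggregate
Outstanding =
VAR StartDate = NPD[Data Inicio]                         -- assumed column name
VAR EndDate   = NPD[Data Fim]                            -- assumed column name
RETURN
    SUMX (
        FILTER (
            'Aquisicoes Agg',
            'Aquisicoes Agg'[Data Vencimento] > StartDate
                && 'Aquisicoes Agg'[Data Vencimento] <= EndDate
        ),
        'Aquisicoes Agg'[Vencimento Total] - 'Aquisicoes Agg'[Liquidado Total]
    )

This way the per-row work in NPD is proportional to the number of distinct dates rather than to the 10M+ rows of 'Aquisições'.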
Best Regards
Yongkang Hua
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.
Yeah, I have been trying to pre-aggregate them and toss the load over to PQ. However, some of the values in the DAX query reference other calculated columns, so they don't exist within PQ on load. Only after the data is loaded are those columns calculated in DAX... because then and only then do the required columns exist.
Still trying to figure this one out =/