Hello, I manage big data in Power BI, and in Power Query the merges take too long, so I decided to switch to DAX. Before starting, I wanted to know whether heavy use of NATURALLEFTOUTERJOIN on big calculated tables, duplicated tables, and other table operations in DAX can affect the performance of the report. Not the refresh, but the report itself.
I just started, and I noticed that my file went from 180,000 KB to 330,000 KB with just one table duplication and a LEFT JOIN between two big tables (the DAX LEFT JOIN does not accept the same column name in both tables, and I can't rename the columns in the original table, so I have to duplicate the table). I'm worried about the performance hit my report may take.
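For reference, the pattern I ended up with looks roughly like this (simplified; the table and column names are invented, and the two tables are assumed to be related on CustomerKey, which NATURALLEFTOUTERJOIN generally needs when it joins model columns directly):

-- full copy of the big table, created only so its column names can be adjusted for the join
Orders_Copy =
SELECTCOLUMNS (
    Orders,
    "CustomerKey",   Orders[CustomerKey],
    "Orders Amount", Orders[Amount],
    "Orders Date",   Orders[OrderDate]
)

-- left outer join between the renamed copy and the other big table
Joined =
NATURALLEFTOUTERJOIN ( Orders_Copy, Customers )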
What is your opinion, and what is your advice?
Hi @Skrnz63
Duplicating a table creates a complete copy of the original table: it connects to the same data source, repeats all the transformation steps, and then imports the data into the model. So I think it will double the size of that data in the file.
Perhaps you could instead try merging the table with itself in Power Query: when you pick the second table in the Merge dialog, the same table is listed with a "(Current)" suffix.
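If you prefer to stay in DAX, another option might be to do the renaming on the fly inside the join expression instead of storing a duplicate table. A rough sketch (the table and column names are just examples; + 0 assumes a numeric key, use & "" for a text key):

Joined =
NATURALLEFTOUTERJOIN (
    -- virtual projection of the big table: keep only the needed columns,
    -- rename the key, and strip its data lineage with + 0 so the join matches by name
    SELECTCOLUMNS ( Orders, "CustomerKey", Orders[ClientID] + 0, "Amount", Orders[Amount] ),
    -- same treatment on the other side so both keys match by name and data type
    SELECTCOLUMNS ( Customers, "CustomerKey", Customers[CustomerKey] + 0, "Customer Name", Customers[CustomerName] )
)

This still materializes the joined result as a calculated table, but it avoids keeping a second full copy of the big table in the model. If you only need one or two columns from the lookup table, ADDCOLUMNS with LOOKUPVALUE is another pattern that avoids table copies entirely.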
Best Regards,
Community Support Team _ Jing
If this post helps, please Accept it as Solution to help other members find it.