Hi all.
I have several huge source tables (>20 GB), and I need to create a new table with the unique values from [Column 1].
I created a new query with List.Combine({Table1[Column1], Table2[Column1]}) and several transformations. After applying the changes, Power BI loads the source tables several times: once for the main data and again for the created lists.
Is it possible to reduce the number of times the source tables are loaded?
I use dataflows as the source for the raw tables.
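For illustration, a minimal sketch of such a query (assuming Table1 and Table2 are the queries loaded from the dataflows, and leaving out the additional transformations):

let
    // Combine the two source columns into one list
    CombinedValues = List.Combine({Table1[Column1], Table2[Column1]}),
    // Keep only the unique values
    DistinctValues = List.Distinct(CombinedValues),
    // Turn the list back into a single-column table
    Result = Table.FromList(DistinctValues, Splitter.SplitByNothing(), {"Column1"})
in
    Result

Each reference to Table1 and Table2 in a query like this is what appears to trigger the extra loads of the source tables during refresh.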
@Denis_Slav, can you try this DAX and see if it saves time?
DISTINCT ( UNION ( ALL ( Table1[Column1] ), ALL ( Table2[Column1] ) ) )
Thanks, I didn't think in that direction.
But in that case I have no way to transform the data.
@Denis_Slav, yes, but Power Query works on the complete table, so once you are done with all the data modeling you can try this.