When I want to make or apply any changes in or from the Power Query Editor, it takes hours, and the loading screen shows an unreasonable amount of data usage (see pictures showing 4 GB and 9.9 GB).
I can't comprehend that my file, which only contains test data and a few calculations, can be so heavy.
[Screenshots of the load dialog: "Load from ValidLP.LCY" and "Load from ValidLP.EUR"]
I have three tables:
A) Main table, "1 VA" (1,200 KB / 3,000 rows)
B) Two lookup tables, "5. ValidLP.EUR" and "5. ValidLP.LCY" (4,000 KB each / 18,000 rows)
I use a few Table.AddColumn steps together with the List.PositionOf function in the main table to do some lookups. I'm not sure whether they could be causing the problem.
Example:
VA_Table
Material
X
Y
Z
V

5 ValidLP.EUR
Material | ListPrice
X | 10
Y | 20
Z | 30
V | 40
= Table.AddColumn(#"Add brugerdefineret4", "ListPriceLookup", each #"5 ValidLP_EUR"[LP_EUR]{List.PositionOf(#"5 ValidLP_EUR"[Material],[Material])})
I am not sure which approach is faster. One option is to buffer your lists before the step above:
Buff_LP_EUR = List.Buffer(#"5 ValidLP_EUR"[LP_EUR])
Buff_Material = List.Buffer(#"5 ValidLP_EUR"[Material])
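A sketch of how the buffered lists could then be used inside your existing step (the step name #"Add brugerdefineret4" and the LP_EUR / Material columns are taken from your formula; Buff_LP_EUR and Buff_Material are the buffered steps above, assumed to sit in the same let expression):

// Reuse the buffered lists instead of re-reading the lookup table's columns on every row.
#"Added ListPriceLookup" = Table.AddColumn(
    #"Add brugerdefineret4",
    "ListPriceLookup",
    each Buff_LP_EUR{List.PositionOf(Buff_Material, [Material])}
)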
or this might suffice:
Table.AddColumn(#"Add brugerdefineret4", "ListPriceLookup", (outer) => Table.SelectRows(#"5 ValidLP_EUR", each [LP_EUR] = outer[Material])[Price]{0})