I have 2 tables.
One is a table of all timestamps per ID where a change has occurred. [Lookup]
The other is a high-resolution timestamp table (30-minute intervals) between the start and end dates of each ID. [History]
I have a DAX solution where, for each row of [History], I look up the most recent change in [Lookup]:
LAST CHANGE =
CALCULATE (
    MAXX ( LOOKUP, LOOKUP[UPDATED_AT] ),
    FILTER (
        LOOKUP,
        LOOKUP[ID] = HISTORY[ID]
            && LOOKUP[UPDATED_AT] <= HISTORY[Time Active]
    )
)
This works fine until the dataset grows too large (around 50 million rows); then there is not enough memory to perform the DAX calculation. I'm wondering if I can achieve the same in Power Query, which might be less memory-hungry.
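One way to express the same logic in Power Query is to merge the two queries on ID and, per History row, keep only the change timestamps at or before [Time Active]. A minimal M sketch, assuming the queries are named Lookup and History with the column names used above:

```
let
    Source = History,
    // Bring in each ID's change rows as a nested table.
    Joined = Table.NestedJoin(Source, {"ID"}, Lookup, {"ID"}, "LookupRows", JoinKind.LeftOuter),
    // For each History row, keep only changes at or before [Time Active] and take the latest one.
    WithLastChange = Table.AddColumn(Joined, "LAST CHANGE", each
        let
            ta = [Time Active],
            prior = Table.SelectRows([LookupRows], (r) => r[UPDATED_AT] <= ta)
        in
            List.Max(prior[UPDATED_AT]),
        type datetime),
    Result = Table.RemoveColumns(WithLastChange, {"LookupRows"})
in
    Result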
Hi @crispybc, try using variables and see the results.
If this post helps, please consider accepting it as a solution to help other members find it more quickly.
Test LAST CHANGE =
VAR _filteredTable =
    FILTER (
        LOOKUP,
        LOOKUP[ID] = HISTORY[ID]
            && LOOKUP[UPDATED_AT] <= HISTORY[Time Active]
    )
VAR _result =
    CALCULATE (
        MAXX ( LOOKUP, LOOKUP[UPDATED_AT] ),
        _filteredTable
    )
RETURN
    _result
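Since this is a calculated column, a variant worth testing drops CALCULATE entirely and iterates the pre-filtered table directly, which avoids a context transition per row. A sketch using the same table and column names as above:

```
LAST CHANGE =
VAR _filteredTable =
    FILTER (
        LOOKUP,
        LOOKUP[ID] = HISTORY[ID]
            && LOOKUP[UPDATED_AT] <= HISTORY[Time Active]
    )
RETURN
    MAXX ( _filteredTable, LOOKUP[UPDATED_AT] )
```

Whether this helps at 50 million rows depends on the model; it is worth comparing both versions on a subset of the data first.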