Hello friends,
I am looking for a way to convert an aggregated or cumulated dataset into a dataset with daily values that show the daily change instead of the aggregated value.
Ideally with the Power Query editor as an additional column, or in Power BI as a calculated column. I don't want to have it as a measure. Another thing to consider is different groups. In the example I only have 2, but there can be hundreds.
It should look like this
I appreciate any help. I thought it would be simple, but for some reason nothing has worked so far.
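To illustrate what I mean with made-up numbers (not my original sample data): if one group has cumulative values of 100 on 2020-01-01, 250 on 2020-01-02 and 400 on 2020-01-03, the new column should hold the daily changes 100, 150 and 150.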
Thanks @amitchandak and @Anonymous,
this is a great and complicated formula. I have, however, experienced something weird.
If I slice the data by one single group and by date it works just fine, but if I remove the group and analyze the data only by date (not by group), the numbers do not match up.
I figured that at some point it is subtracting too much, so that the non-aggregated value becomes negative on a daily level. I will think about it a bit and let you know. But perhaps you have an idea right away 🙂
Hi @raymond ,
Thanks for your feedback. The non-aggregated value from the created calculated column is correct on a daily level when I remove the group and filter the data based on date...
Could you please provide some screenshots that display the negative value when you slice the data? Thank you.
Best Regards
Rena
Thanks again @Anonymous .
You are right! I have checked it again as well. But I have found something weird in my data. Due to data cleansing, some accumulated values have later been corrected and thus reduced, e.g.
2020-01-01 = 400
2020-01-10 = 300
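(with the suggested calculated column this row comes out as 300 - 400 = -100, i.e. a negative daily value)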
Do you have an idea how to cope with such cases as well?
Hi @raymond ,
There may not be a better way to do this; you may have to combine the following formula with a manual check to obtain the correct value... For the data in your example: keep the original value if the value on 2020-01-10 is less than the value on the previous date; otherwise, take the value returned by the formula.
2020-01-01 = 400
2020-01-10 = 300
Non-aggregate = 'group'[Value] - CALCULATE(MAX('group'[Value]), FILTER('group', 'group'[GroupID] = EARLIER('group'[GroupID]) && 'group'[Date] < EARLIER('group'[Date])))
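If you would rather handle the corrected values directly in the column, here is a rough sketch of how the two cases could be combined (assuming the rule is simply: keep the original value whenever the cumulative value drops below the previous one):
Non-aggregate combined =
VAR PrevValue = CALCULATE(MAX('group'[Value]), FILTER('group', 'group'[GroupID] = EARLIER('group'[GroupID]) && 'group'[Date] < EARLIER('group'[Date])))
RETURN IF('group'[Value] < PrevValue, 'group'[Value], 'group'[Value] - PrevValue)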
Best Regards
Rena
Hi @raymond ,
You can create a calculated column as below:
Non-aggregate = 'group'[Value] - CALCULATE(MAX('group'[Value]), FILTER('group', 'group'[GroupID] = EARLIER('group'[GroupID]) && 'group'[Date] < EARLIER('group'[Date])))
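For example, with made-up cumulative values of 10, 30 and 55 on three consecutive dates within one group, the column returns 10, 20 and 25: each row's value minus the highest value of any earlier date in the same group, which equals the previous date's value as long as the cumulative values never decrease.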
Best Regards
Rena
Try new columns like:
Max date = MAXX(FILTER(Table1, Table1[group_id] = EARLIER(Table1[group_id]) && Table1[date] < EARLIER(Table1[date])), Table1[date])
Non cumm value = Table1[value] - MAXX(FILTER(Table1, Table1[group_id] = EARLIER(Table1[group_id]) && Table1[date] = EARLIER(Table1[Max date])), Table1[value])
or
Non cumm value = Table1[value] - MAXX(FILTER(Table1, Table1[group_id] = EARLIER(Table1[group_id]) && Table1[Max date] < EARLIER(Table1[date])), Table1[value])
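If a single column is preferred, here is a sketch of the same idea using variables (Table1 and the column names are placeholders; adjust them to your own model):
Non cumm value single =
VAR PrevDate = MAXX(FILTER(Table1, Table1[group_id] = EARLIER(Table1[group_id]) && Table1[date] < EARLIER(Table1[date])), Table1[date])
VAR PrevValue = MAXX(FILTER(Table1, Table1[group_id] = EARLIER(Table1[group_id]) && Table1[date] = PrevDate), Table1[value])
RETURN Table1[value] - PrevValue
For the first date of a group, PrevValue is blank, so the column simply returns the row's own value.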