
allenind
Frequent Visitor

Dynamic Moving Average Computed Column

Hi, I've been trying to figure out a solution to my problem, which is to create a dynamic moving average column for my data, which looks like this:

| File Timestamp | Location | Total Licenses | In-Use Licenses | Free Licenses | Percentage Usage |
| --- | --- | --- | --- | --- | --- |
| 3/10/20 4:45 PM | Thailand | 1500 | 2 | 1498 | 0.001333333 |
| 3/10/20 4:45 PM | San Antonio | 8000 | 1887 | 6113 | 0.235875 |
| 3/10/20 4:45 PM | Houston | 12000 | 4435 | 7565 | 0.369583333 |
| 3/10/20 4:45 PM | London | 4000 | 1150 | 2850 | 0.2875 |
| 3/10/20 4:45 PM | Angola | 700 | 205 | 495 | 0.292857143 |
| 3/10/20 4:45 PM | Perth | 3000 | 686 | 2314 | 0.228666667 |
| 3/10/20 4:45 PM | Singapore | 5000 | 1972 | 3028 | 0.3944 |
| 3/10/20 4:56 PM | Thailand | 1500 | 2 | 1498 | 0.001333333 |
| 3/10/20 4:56 PM | San Antonio | 8000 | 1881 | 6119 | 0.235125 |
| 3/10/20 4:56 PM | Houston | 12000 | 4414 | 7586 | 0.367833333 |
| 3/10/20 4:56 PM | London | 4000 | 1161 | 2839 | 0.29025 |
| 3/10/20 4:56 PM | Angola | 700 | 204 | 496 | 0.291428571 |
| 3/10/20 4:56 PM | Perth | 3000 | 692 | 2308 | 0.230666667 |
| 3/10/20 4:56 PM | Singapore | 5000 | 1967 | 3033 | 0.3934 |
| 3/10/20 5:06 PM | Thailand | 1500 | 2 | 1498 | 0.001333333 |
| 3/10/20 5:06 PM | San Antonio | 8000 | 1881 | 6119 | 0.235125 |
| 3/10/20 5:06 PM | Houston | 12000 | 4404 | 7596 | 0.367 |
| 3/10/20 5:06 PM | London | 4000 | 1168 | 2832 | 0.292 |
| 3/10/20 5:06 PM | Angola | 700 | 206 | 494 | 0.294285714 |
| 3/10/20 5:06 PM | Perth | 3000 | 695 | 2305 | 0.231666667 |
| 3/10/20 5:06 PM | Singapore | 5000 | 1960 | 3040 | 0.392 |
| 3/10/20 5:16 PM | Thailand | 1500 | 2 | 1498 | 0.001333333 |
| 3/10/20 5:16 PM | San Antonio | 8000 | 1877 | 6123 | 0.234625 |
| 3/10/20 5:16 PM | Houston | 12000 | 4395 | 7605 | 0.36625 |
| 3/10/20 5:16 PM | London | 4000 | 1165 | 2835 | 0.29125 |
| 3/10/20 5:16 PM | Angola | 700 | 203 | 497 | 0.29 |
| 3/10/20 5:16 PM | Perth | 3000 | 692 | 2308 | 0.230666667 |
| 3/10/20 5:16 PM | Singapore | 5000 | 1962 | 3038 | 0.3924 |
| 3/10/20 5:26 PM | Thailand | 1500 | 2 | 1498 | 0.001333333 |
| 3/10/20 5:26 PM | San Antonio | 8000 | 1876 | 6124 | 0.2345 |
| 3/10/20 5:26 PM | Houston | 12000 | 4376 | 7624 | 0.364666667 |
| 3/10/20 5:26 PM | London | 4000 | 1174 | 2826 | 0.2935 |
| 3/10/20 5:26 PM | Angola | 700 | 201 | 499 | 0.287142857 |
| 3/10/20 5:26 PM | Perth | 3000 | 692 | 2308 | 0.230666667 |
| 3/10/20 5:26 PM | Singapore | 5000 | 1966 | 3034 | 0.3932 |

 

I've been trying to create a dynamic average over every 10 minutes, per Location. I'm completely stumped; maybe this is because of my inexperience with DAX or Power BI in general, but I'm hoping to put the moving average in one column for each location. Any help would be appreciated! My data always arrives in that structure: one row per location, from Thailand through Singapore, every time it is updated.

 

Here is the Excel file of the dataset if anyone wants to give it a go: Dynamic Average Sample

 

1 ACCEPTED SOLUTION

@allenind ,

 

try a new column like

AVERAGEX(
    FILTER(
        'Table',
        'Table'[File Timestamp] <= EARLIER('Table'[File Timestamp])
            && 'Table'[Location] = EARLIER('Table'[Location])
    ),
    'Table'[In-Use Licenses]
)

 

View solution in original post

6 REPLIES
amitchandak
Super User

What is the expected output? What do you mean by a 10-minute average? Can you explain with an example?

 

TomMartens
Super User

Hey @allenind ,

 

Can you please describe the expected outcome?

 

Regards,

Tom

 



Did I answer your question? Mark my post as a solution, this will help others!

Proud to be a Super User!
I accept Kudos 😉
Hamburg, Germany

Hello @TomMartens

 

| File Timestamp | Location | Total Licenses | In-Use Licenses | Free Licenses | Percentage Usage | Moving Average |
| --- | --- | --- | --- | --- | --- | --- |
| 3/10/20 4:45 PM | Thailand | 1500 | 2 | 1498 | 0.001333333 | = 2/1 |
| 3/10/20 4:45 PM | San Antonio | 8000 | 1887 | 6113 | 0.235875 | = 1887/1 |
| 3/10/20 4:45 PM | Houston | 12000 | 4435 | 7565 | 0.369583333 | = 4435/1 |
| 3/10/20 4:45 PM | London | 4000 | 1150 | 2850 | 0.2875 | = 1150/1 |
| 3/10/20 4:45 PM | Angola | 700 | 205 | 495 | 0.292857143 | = 205/1 |
| 3/10/20 4:45 PM | Perth | 3000 | 686 | 2314 | 0.228666667 | = 686/1 |
| 3/10/20 4:45 PM | Singapore | 5000 | 1972 | 3028 | 0.3944 | = 1972/1 |
| 3/10/20 4:56 PM | Thailand | 1500 | 2 | 1498 | 0.001333333 | = (2 + 2)/2 |
| 3/10/20 4:56 PM | San Antonio | 8000 | 1881 | 6119 | 0.235125 | = (1887 + 1881)/2 |
| 3/10/20 4:56 PM | Houston | 12000 | 4414 | 7586 | 0.367833333 | = (4435 + 4414)/2 |
| 3/10/20 4:56 PM | London | 4000 | 1161 | 2839 | 0.29025 | = (1161 + 1150)/2 |
| 3/10/20 4:56 PM | Angola | 700 | 204 | 496 | 0.291428571 | = (205 + 204)/2 |
| 3/10/20 4:56 PM | Perth | 3000 | 692 | 2308 | 0.230666667 | = (692 + 686)/2 |
| 3/10/20 4:56 PM | Singapore | 5000 | 1967 | 3033 | 0.3934 | = (1967 + 1972)/2 |
| 3/10/20 5:06 PM | Thailand | 1500 | 2 | 1498 | 0.001333333 | = (2 + 2 + 2)/3 |
| 3/10/20 5:06 PM | San Antonio | 8000 | 1881 | 6119 | 0.235125 | = (1881 + 1887 + 1881)/3 |
| 3/10/20 5:06 PM | Houston | 12000 | 4404 | 7596 | 0.367 | = (4404 + 4435 + 4414)/3 |
| 3/10/20 5:06 PM | London | 4000 | 1168 | 2832 | 0.292 | = (1168 + 1161 + 1150)/3 |
| 3/10/20 5:06 PM | Angola | 700 | 206 | 494 | 0.294285714 | = (206 + 205 + 204)/3 |
| 3/10/20 5:06 PM | Perth | 3000 | 695 | 2305 | 0.231666667 | = (695 + 692 + 686)/3 |
| 3/10/20 5:06 PM | Singapore | 5000 | 1960 | 3040 | 0.392 | = (1960 + 1967 + 1972)/3 |

 

I want another column that, for each Location, takes the "In-Use Licenses" values, accumulates each new entry, and averages them by the number of entries seen so far. Is this at all possible?
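The running average described above can be sketched outside Power BI in plain Python (a minimal sketch; the sample values are taken from the Houston and London rows of the table, and the function name is illustrative, not part of the dataset):

```python
# Per-location running ("expanding") average: for each Location, average
# all In-Use Licenses values seen so far, in File Timestamp order.
rows = [
    ("3/10/20 4:45 PM", "Houston", 4435),
    ("3/10/20 4:45 PM", "London", 1150),
    ("3/10/20 4:56 PM", "Houston", 4414),
    ("3/10/20 4:56 PM", "London", 1161),
    ("3/10/20 5:06 PM", "Houston", 4404),
    ("3/10/20 5:06 PM", "London", 1168),
]

def running_averages(rows):
    """Extend each (timestamp, location, in_use) row with its running average."""
    sums, counts, out = {}, {}, []
    for ts, loc, in_use in rows:  # rows assumed already sorted by timestamp
        sums[loc] = sums.get(loc, 0) + in_use
        counts[loc] = counts.get(loc, 0) + 1
        out.append((ts, loc, in_use, sums[loc] / counts[loc]))
    return out

for ts, loc, in_use, avg in running_averages(rows):
    # e.g. Houston at 5:06 PM -> (4435 + 4414 + 4404) / 3
    print(ts, loc, in_use, round(avg, 2))
```

This matches the worked examples in the table: Houston's third value is (4435 + 4414 + 4404)/3 and London's is (1150 + 1161 + 1168)/3.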

 

@allenind ,

 

try a new column like

AVERAGEX(
    FILTER(
        'Table',
        'Table'[File Timestamp] <= EARLIER('Table'[File Timestamp])
            && 'Table'[Location] = EARLIER('Table'[Location])
    ),
    'Table'[In-Use Licenses]
)

 

Thank you so much @amitchandak! This worked perfectly. More power to you good sir! 😁

 

This is an example of what I want to achieve @amitchandak! Thank you for the fast response 🙂 

 
