Hi, I've been trying to figure out a solution to my problem, which is to create a dynamic moving average column for my data, which looks like this:
File Timestamp | Location | Total Licenses | In-Use Licenses | Free Licenses | Percentage Usage |
---|---|---|---|---|---|
3/10/20 4:45 PM | Thailand | 1500 | 2 | 1498 | 0.001333333 |
3/10/20 4:45 PM | San Antonio | 8000 | 1887 | 6113 | 0.235875 |
3/10/20 4:45 PM | Houston | 12000 | 4435 | 7565 | 0.369583333 |
3/10/20 4:45 PM | London | 4000 | 1150 | 2850 | 0.2875 |
3/10/20 4:45 PM | Angola | 700 | 205 | 495 | 0.292857143 |
3/10/20 4:45 PM | Perth | 3000 | 686 | 2314 | 0.228666667 |
3/10/20 4:45 PM | Singapore | 5000 | 1972 | 3028 | 0.3944 |
3/10/20 4:56 PM | Thailand | 1500 | 2 | 1498 | 0.001333333 |
3/10/20 4:56 PM | San Antonio | 8000 | 1881 | 6119 | 0.235125 |
3/10/20 4:56 PM | Houston | 12000 | 4414 | 7586 | 0.367833333 |
3/10/20 4:56 PM | London | 4000 | 1161 | 2839 | 0.29025 |
3/10/20 4:56 PM | Angola | 700 | 204 | 496 | 0.291428571 |
3/10/20 4:56 PM | Perth | 3000 | 692 | 2308 | 0.230666667 |
3/10/20 4:56 PM | Singapore | 5000 | 1967 | 3033 | 0.3934 |
3/10/20 5:06 PM | Thailand | 1500 | 2 | 1498 | 0.001333333 |
3/10/20 5:06 PM | San Antonio | 8000 | 1881 | 6119 | 0.235125 |
3/10/20 5:06 PM | Houston | 12000 | 4404 | 7596 | 0.367 |
3/10/20 5:06 PM | London | 4000 | 1168 | 2832 | 0.292 |
3/10/20 5:06 PM | Angola | 700 | 206 | 494 | 0.294285714 |
3/10/20 5:06 PM | Perth | 3000 | 695 | 2305 | 0.231666667 |
3/10/20 5:06 PM | Singapore | 5000 | 1960 | 3040 | 0.392 |
3/10/20 5:16 PM | Thailand | 1500 | 2 | 1498 | 0.001333333 |
3/10/20 5:16 PM | San Antonio | 8000 | 1877 | 6123 | 0.234625 |
3/10/20 5:16 PM | Houston | 12000 | 4395 | 7605 | 0.36625 |
3/10/20 5:16 PM | London | 4000 | 1165 | 2835 | 0.29125 |
3/10/20 5:16 PM | Angola | 700 | 203 | 497 | 0.29 |
3/10/20 5:16 PM | Perth | 3000 | 692 | 2308 | 0.230666667 |
3/10/20 5:16 PM | Singapore | 5000 | 1962 | 3038 | 0.3924 |
3/10/20 5:26 PM | Thailand | 1500 | 2 | 1498 | 0.001333333 |
3/10/20 5:26 PM | San Antonio | 8000 | 1876 | 6124 | 0.2345 |
3/10/20 5:26 PM | Houston | 12000 | 4376 | 7624 | 0.364666667 |
3/10/20 5:26 PM | London | 4000 | 1174 | 2826 | 0.2935 |
3/10/20 5:26 PM | Angola | 700 | 201 | 499 | 0.287142857 |
3/10/20 5:26 PM | Perth | 3000 | 692 | 2308 | 0.230666667 |
3/10/20 5:26 PM | Singapore | 5000 | 1966 | 3034 | 0.3932 |
I've been trying to create a dynamic average for every 10-minute interval, per Location. I'm completely stumped; maybe it's because of my inexperience with DAX or Power BI in general, but I'm hoping to put the moving average in one column for each location. Any help would be appreciated! My data always arrives in that format: it runs from Thailand through Singapore every time it is updated.
Here is the Excel file of the dataset if anyone wants to give it a go: Dynamic Average Sample
What is the expected output? What do you mean by a 10-minute average? Can you explain with an example?
Hey @allenind ,
Can you please describe the expected outcome?
Regards,
Tom
Hello @TomMartens
File Timestamp | Location | Total Licenses | In-Use Licenses | Free Licenses | Percentage Usage | Moving Average |
---|---|---|---|---|---|---|
3/10/20 4:45 PM | Thailand | 1500 | 2 | 1498 | 0.001333333 | = 2/1 |
3/10/20 4:45 PM | San Antonio | 8000 | 1887 | 6113 | 0.235875 | = 1887/1 |
3/10/20 4:45 PM | Houston | 12000 | 4435 | 7565 | 0.369583333 | = 4435/1 |
3/10/20 4:45 PM | London | 4000 | 1150 | 2850 | 0.2875 | = 1150/1 |
3/10/20 4:45 PM | Angola | 700 | 205 | 495 | 0.292857143 | = 205/1 |
3/10/20 4:45 PM | Perth | 3000 | 686 | 2314 | 0.228666667 | = 686/1 |
3/10/20 4:45 PM | Singapore | 5000 | 1972 | 3028 | 0.3944 | = 1972/1 |
3/10/20 4:56 PM | Thailand | 1500 | 2 | 1498 | 0.001333333 | = (2+2)/2 |
3/10/20 4:56 PM | San Antonio | 8000 | 1881 | 6119 | 0.235125 | = (1887+1881)/2 |
3/10/20 4:56 PM | Houston | 12000 | 4414 | 7586 | 0.367833333 | = (4435+4414)/2 |
3/10/20 4:56 PM | London | 4000 | 1161 | 2839 | 0.29025 | = (1150+1161)/2 |
3/10/20 4:56 PM | Angola | 700 | 204 | 496 | 0.291428571 | = (205+204)/2 |
3/10/20 4:56 PM | Perth | 3000 | 692 | 2308 | 0.230666667 | = (686+692)/2 |
3/10/20 4:56 PM | Singapore | 5000 | 1967 | 3033 | 0.3934 | = (1972+1967)/2 |
3/10/20 5:06 PM | Thailand | 1500 | 2 | 1498 | 0.001333333 | = (2+2+2)/3 |
3/10/20 5:06 PM | San Antonio | 8000 | 1881 | 6119 | 0.235125 | = (1887+1881+1881)/3 |
3/10/20 5:06 PM | Houston | 12000 | 4404 | 7596 | 0.367 | = (4435+4414+4404)/3 |
3/10/20 5:06 PM | London | 4000 | 1168 | 2832 | 0.292 | = (1150+1161+1168)/3 |
3/10/20 5:06 PM | Angola | 700 | 206 | 494 | 0.294285714 | = (205+204+206)/3 |
3/10/20 5:06 PM | Perth | 3000 | 695 | 2305 | 0.231666667 | = (686+692+695)/3 |
3/10/20 5:06 PM | Singapore | 5000 | 1960 | 3040 | 0.392 | = (1972+1967+1960)/3 |
I want another column that takes the In-Use Licenses values per Location, adds each succeeding entry as it arrives, and averages the running total by the number of entries it has gone through so far. Is this at all possible?
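Put differently, the requested column is a per-location running mean. In notation of my own (not from the thread): if $x_1, \dots, x_n$ are the In-Use Licenses readings for one location up to and including the current timestamp, the column value at the $n$-th reading is

$$\bar{x}_n = \frac{1}{n} \sum_{i=1}^{n} x_i.$$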
Try a new column like:

AVERAGEX (
    FILTER (
        'Table',
        'Table'[File Timestamp] <= EARLIER ( 'Table'[File Timestamp] )
            && 'Table'[Location] = EARLIER ( 'Table'[Location] )
    ),
    'Table'[In-Use Licenses]
)
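If you prefer to avoid EARLIER, the same calculated column can be written with variables. This is a minimal sketch, assuming your table really is named 'Table' as in the formula above (swap in your actual table name):

Moving Average =
-- Capture the current row's values from the row context
VAR CurrentTimestamp = 'Table'[File Timestamp]
VAR CurrentLocation = 'Table'[Location]
RETURN
    -- Average In-Use Licenses over all rows for this location
    -- up to and including the current timestamp
    AVERAGEX (
        FILTER (
            'Table',
            'Table'[File Timestamp] <= CurrentTimestamp
                && 'Table'[Location] = CurrentLocation
        ),
        'Table'[In-Use Licenses]
    )

Both versions produce the (sum so far)/(count so far) column from the example; the variable form just makes the "current row" values explicit.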
Thank you so much @amitchandak! This worked perfectly. More power to you good sir! 😁
This is an example of what I want to achieve @amitchandak! Thank you for the fast response 🙂