Hi guys,
I have this dataset:
| Session Type | Film_ID | Start_Date | End_Date |
| --- | --- | --- | --- |
| Action | FI100692453 | 2013-09-04 12:40:29.000 | 2013-09-04 15:56:18.000 |
| Action | FI100692479 | 2013-09-04 14:21:27.000 | 2013-09-06 16:56:03.000 |
| Action | FI100692503 | 2013-09-04 15:07:53.000 | 2013-09-04 15:31:39.000 |
| Action | FI100692516 | 2013-09-04 15:41:28.000 | 2013-09-06 10:12:25.000 |
| Action | FI100692529 | 2013-09-04 16:08:34.000 | 2013-09-04 17:07:08.000 |
| Action | FI100692538 | 2013-09-04 16:11:10.000 | 2013-09-04 17:28:22.000 |
| Action | FI100692541 | 2013-09-04 16:13:25.000 | 2013-09-06 10:12:25.000 |
| Action | FI100692599 | 2013-09-04 19:41:49.000 | 2013-09-04 20:23:15.000 |
| Action | FI100692647 | 2013-09-05 01:05:18.000 | 2013-09-05 03:48:43.000 |
| Action | FI100692737 | 2013-09-05 10:27:06.000 | 2013-09-05 10:44:05.000 |
| Action | FI100692835 | 2013-09-05 15:20:12.000 | 2013-09-06 12:10:45.000 |
| Action | FI100692884 | 2013-09-05 19:42:58.000 | 2013-09-05 23:58:35.000 |
| Action | FI100692964 | 2013-09-06 02:22:51.000 | 2013-09-06 02:28:08.000 |
| Action | FI100693100 | 2013-09-06 12:22:54.000 | 2013-09-06 13:31:22.000 |
| Action | FI100693103 | 2013-09-06 12:23:00.000 | 2013-09-06 13:10:43.000 |
Basically, this dataset comes from a cinema and shows the dates on which each Film_ID session was opened and closed.
I'm trying to create two measures to insert into a Line and Clustered Column Chart:
Columns:
- Number of Sessions Open in Week
- Number of Sessions Close in Week
Lines (Rolling Average):
- Average number of Films_ID open per week (based on last 6 months)
- Average number of Films_ID closed per week (based on last 6 months)
To calculate the column measures, I created Year, Month and Week columns for both the Open and Close dates. To calculate the Number of Sessions Close in Week, I use this formula:
Close_Sessions =
SUMX (
    // Group by year / month / week of the End_Date and count distinct films per group
    SUMMARIZE (
        'Movies_Data';
        Movies_Data[Week_End_Date];
        Movies_Data[Month_End_Date];
        Movies_Data[Year_End_Date];
        "CloseMovies"; DISTINCTCOUNT ( Movies_Data[Film_ID] )
    );
    // Then add the per-group counts back up
    [CloseMovies]
)
I don't know if this formula is the best way to calculate the measure that I want. But my question is:
How can I calculate the Rolling Average based on my data and the measures that I create above?
Thanks for your support 🙂
Hi @SaCvP125,
If I understand you correctly, you can first create a YearMonth column for the Open and Close dates.
YearMonth_End_Date = Movies_Data[Year_End_Date]*12+Movies_Data[Month_End_Date]
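For example, September 2017 becomes 2017 * 12 + 9 = 24213 and March 2017 becomes 24207, so consecutive months map to consecutive integers and "the last 6 months" stays a simple subtraction even across year boundaries.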
Then you should be able to use the formula below to create a new measure to calculate the Rolling Average in your scenario.
Rolling Average =
// Distinct sessions closed in any week up to the current one, within the trailing 6-month window
// (this uses Session_ID; swap in Film_ID if that is the key column in your table)
VAR Close_Sessions_Last_6_Month =
    CALCULATE (
        DISTINCTCOUNT ( Movies_Data[Session_ID] ),
        FILTER (
            ALL ( Movies_Data ),
            Movies_Data[Week_End_Date] <= MAX ( Movies_Data[Week_End_Date] )
                && Movies_Data[YearMonth_End_Date]
                    >= MAX ( Movies_Data[YearMonth_End_Date] ) - 6
        )
    )
// Number of distinct weeks in the same window, used as the divisor
VAR Weeks_Last_6_Month =
    CALCULATE (
        DISTINCTCOUNT ( Movies_Data[Week_End_Date] ),
        FILTER (
            ALL ( Movies_Data ),
            Movies_Data[Week_End_Date] <= MAX ( Movies_Data[Week_End_Date] )
                && Movies_Data[YearMonth_End_Date]
                    >= MAX ( Movies_Data[YearMonth_End_Date] ) - 6
        )
    )
RETURN
    DIVIDE ( Close_Sessions_Last_6_Month, Weeks_Last_6_Month, 0 )
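To plot it, you would put Week_End_Date on the chart's shared axis and this measure as the line value; your existing weekly counts stay as the columns.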
Here is the sample pbix file for your reference.
Regards
@v-ljerr-msft many thanks for your response 🙂 I'm getting this error when I try your solution on my data:
The expression refers to multiple columns. Multiple columns cannot be converted to a scalar value.
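That error generally means a table with more than one column ended up somewhere DAX expects a single value. Without seeing the exact formula it is hard to say which step triggers it, but one common cause is entering the YearMonth_End_Date expression as a measure instead of a calculated column. Purely as a troubleshooting sketch (same window logic, with the MAX values captured in variables and ALL limited to the two columns involved; Film_ID is substituted for Session_ID here, and every name is taken from the posts above, not verified against the real model):
Rolling Average Alt =
// Capture the current context's boundaries once, before any filters are removed
VAR MaxWeek = MAX ( Movies_Data[Week_End_Date] )
VAR MaxYearMonth = MAX ( Movies_Data[YearMonth_End_Date] )
// The week/month combinations inside the trailing window
VAR WindowRows =
    FILTER (
        ALL ( Movies_Data[Week_End_Date], Movies_Data[YearMonth_End_Date] ),
        Movies_Data[Week_End_Date] <= MaxWeek
            && Movies_Data[YearMonth_End_Date] >= MaxYearMonth - 6
    )
VAR Close_Sessions_Last_6_Month =
    CALCULATE ( DISTINCTCOUNT ( Movies_Data[Film_ID] ), WindowRows )
VAR Weeks_Last_6_Month =
    CALCULATE ( DISTINCTCOUNT ( Movies_Data[Week_End_Date] ), WindowRows )
RETURN
    DIVIDE ( Close_Sessions_Last_6_Month, Weeks_Last_6_Month, 0 )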
Hi,
On the X-axis of your visual, do you want to see the week number? Please confirm.
Yes 😉
Hi,
Based on the table that you have pasted above, please show (in a simple table) what the result (numbers) should be for each of the 4 measures that you want to compute.
Using Excel, I built this simple summary table:
| Year_End | Month_End | Week_End | Count of Film_ID | Moving_Average |
| --- | --- | --- | --- | --- |
| 2017 | 8 | 32 | 4 | |
| 2017 | 9 | 36 | 4 | |
| 2017 | 9 | 37 | 4 | 4 |
| 2017 | 9 | 38 | 3 | 4 |
| 2017 | 9 | 39 | 8 | 3.5 |
| 2017 | 10 | 40 | 4 | 5.5 |
| 2017 | 10 | 41 | 6 | 6 |
| 2017 | 10 | 42 | 7 | 5 |
That allows me to create this chart:
These give me two of the four measures that I want (a DAX sketch of both follows the list):
- Number of Films ended by week
- Rolling average of the two preceding weeks
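A minimal DAX sketch of those two measures, assuming Movies_Data carries the Week_End_Date column used earlier and that the week number sits on the visual's axis (the names and the two-week window are read off the Excel table above, not a confirmed solution):
Films Ended =
// Distinct films whose End_Date falls in the week shown on the axis
DISTINCTCOUNT ( Movies_Data[Film_ID] )
Rolling Average 2 Weeks =
// Average of the weekly count over the two weeks before the current one,
// which reproduces the Moving_Average column above (blank when no prior weeks exist)
VAR MaxWeek = MAX ( Movies_Data[Week_End_Date] )
RETURN
    AVERAGEX (
        FILTER (
            ALL ( Movies_Data[Week_End_Date] ),
            Movies_Data[Week_End_Date] >= MaxWeek - 2
                && Movies_Data[Week_End_Date] < MaxWeek
        ),
        CALCULATE ( DISTINCTCOUNT ( Movies_Data[Film_ID] ) )
    )
One caveat: week numbers reset every January, so a continuous week index built over a calendar table would be safer across year boundaries.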
Hi,
In the original dataset that you pasted, all End dates fall in week number 36, so even after I write my measure for the rolling 6-month average I will not be able to test my result. Please paste the dataset from which you built your expected result table.
So I will need a dataset which spans multiple weeks/months, so that I can compute the rolling 6-month average week over week. Please share a dataset which spans multiple months. I'll try to use the Date/time intelligence functions to get the desired result.
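For reference, a time-intelligence sketch of that idea might look like the following, assuming a separate Dates calendar table (with Date and WeekNum columns, both assumed names) related to Movies_Data[End_Date]:
Films Ended =
DISTINCTCOUNT ( Movies_Data[Film_ID] )
Rolling Average 6 Months =
// Average weekly closings over the 6 months ending at the last visible date
VAR Window =
    DATESINPERIOD ( Dates[Date], MAX ( Dates[Date] ), -6, MONTH )
RETURN
    DIVIDE (
        CALCULATE ( [Films Ended], Window ),
        CALCULATE ( DISTINCTCOUNT ( Dates[WeekNum] ), Window )
    )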
Thanks for your support! I extracted a sample from the last 6 months:
| Session_ID | Start_Date | End_Date |
| --- | --- | --- |
| FI646517 | 22/05/2017 | 22/05/2017 |
| FI622545 | 20/03/2017 | 06/04/2017 |
| FI639607 | 08/05/2017 | 08/05/2017 |
| FI662640 | 29/06/2017 | 29/06/2017 |
| FI706498 | 11/10/2017 | 11/10/2017 |
| FI627592 | 03/04/2017 | 03/04/2017 |
| FI622662 | 20/03/2017 | 20/03/2017 |
| FI669224 | 12/07/2017 | 12/07/2017 |
| FI690122 | 29/08/2017 | 29/08/2017 |
| FI677357 | 31/07/2017 | 31/07/2017 |
| FI685754 | 20/08/2017 | 20/08/2017 |
| FI667468 | 09/07/2017 | 09/07/2017 |
| FI690091 | 29/08/2017 | 29/08/2017 |
| FI633245 | 19/04/2017 | 19/04/2017 |
| FI652780 | 04/06/2017 | 04/06/2017 |
| FI671628 | 18/07/2017 | 18/07/2017 |
| FI693442 | 06/09/2017 | 07/09/2017 |
| FI657914 | 21/06/2017 | 21/06/2017 |
| FI683804 | 14/08/2017 | 21/08/2017 |
| FI707257 | 12/10/2017 | 12/10/2017 |
| FI697007 | 17/09/2017 | 17/09/2017 |
| FI648608 | 27/05/2017 | 27/05/2017 |
| FI672588 | 19/07/2017 | 19/07/2017 |
| FI701468 | 27/09/2017 | 27/09/2017 |
| FI701841 | 28/09/2017 | 28/09/2017 |
| FI659885 | 26/06/2017 | 26/06/2017 |
| FI664996 | 03/07/2017 | 03/07/2017 |
| FI696996 | 17/09/2017 | 17/09/2017 |
| FI663874 | 29/06/2017 | 30/06/2017 |
| FI642132 | 13/05/2017 | 13/05/2017 |
| FI690987 | 31/08/2017 | 31/08/2017 |
| FI673621 | 21/07/2017 | 25/07/2017 |
| FI661844 | 28/06/2017 | 29/06/2017 |
| FI621912 | 18/03/2017 | 20/03/2017 |
| FI661818 | 28/06/2017 | 28/06/2017 |
| FI684195 | 16/08/2017 | 16/08/2017 |
| FI642139 | 13/05/2017 | 13/05/2017 |
| FI705325 | 08/10/2017 | 08/10/2017 |
| FI628591 | 06/04/2017 | 06/04/2017 |
| FI678653 | 03/08/2017 | 03/08/2017 |
| FI685904 | 21/08/2017 | 21/08/2017 |
| FI692017 | 03/09/2017 | 06/09/2017 |
| FI679778 | 05/08/2017 | 05/08/2017 |
| FI693430 | 06/09/2017 | 04/10/2017 |
| FI663261 | 29/06/2017 | 29/06/2017 |
| FI641600 | 12/05/2017 | 12/05/2017 |
| FI626828 | 31/03/2017 | 31/03/2017 |
| FI636249 | 27/04/2017 | 28/04/2017 |
| FI633574 | 20/04/2017 | 20/04/2017 |
| FI693142 | 06/09/2017 | 06/09/2017 |
| FI625161 | 26/03/2017 | 27/03/2017 |
| FI656606 | 17/06/2017 | 18/06/2017 |
| FI673114 | 20/07/2017 | 20/07/2017 |
| FI664533 | 01/07/2017 | 01/07/2017 |
| FI625106 | 26/03/2017 | 27/03/2017 |
| FI686182 | 21/08/2017 | 21/08/2017 |
| FI670532 | 16/07/2017 | 16/07/2017 |
| FI639874 | 08/05/2017 | 08/05/2017 |
| FI682036 | 10/08/2017 | 11/08/2017 |
Hi @SaCvP125,
See if my solution here helps.