Hi. I am trying to create a measure of volatility for product prices over whatever time period is shown in a visual, constrained by a slicer. I thought the formula below would do it, but instead the result is something mystical. The price table has about 6,000 products with about 750 dated records each.
Sample data looks like this:

Date | Item ID | Price
---|---|---
14/09/2021 | 2 | 1079087
14/09/2021 | 2 | 1425434
15/09/2021 | 2 | 732740.3
16/09/2021 | 2 | 386393.3
17/09/2021 | 2 | 386393.3
14/09/2021 | 6 | 1194536
14/09/2021 | 6 | 1540883
15/09/2021 | 6 | 848189.3
16/09/2021 | 6 | 501842.3
17/09/2021 | 6 | 234734
13/09/2021 | 8 | 1656332
14/09/2021 | 8 | 1656332
15/09/2021 | 8 | 1656332
16/09/2021 | 8 | 617291.3
17/09/2021 | 8 | 231325
I am trying to get a result for volatility like this:

ID | Changes | Records | Calculated | Volatility
---|---|---|---|---
2 | 3 | 5 | =3/(5-1) | 0.75
6 | 4 | 5 | =4/(5-1) | 1.00
8 | 2 | 5 | =2/(5-1) | 0.50
Help as to what I am actually calculating with my formula, and what I should do to get the right result, would be greatly appreciated.
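For clarity, the target calculation (consecutive price changes divided by records minus one) can be sketched in Python, under the assumption that each item's rows are already in chronological order:

```python
# Sketch of the target calculation, assuming each item's rows arrive
# in chronological order (dates, IDs, and prices from the sample data).
from collections import defaultdict

rows = [  # (date, item_id, price)
    ("14/09/2021", 2, 1079087.0), ("14/09/2021", 2, 1425434.0),
    ("15/09/2021", 2, 732740.3), ("16/09/2021", 2, 386393.3),
    ("17/09/2021", 2, 386393.3),
    ("14/09/2021", 6, 1194536.0), ("14/09/2021", 6, 1540883.0),
    ("15/09/2021", 6, 848189.3), ("16/09/2021", 6, 501842.3),
    ("17/09/2021", 6, 234734.0),
    ("13/09/2021", 8, 1656332.0), ("14/09/2021", 8, 1656332.0),
    ("15/09/2021", 8, 1656332.0), ("16/09/2021", 8, 617291.3),
    ("17/09/2021", 8, 231325.0),
]

prices_by_item = defaultdict(list)
for _, item_id, price in rows:
    prices_by_item[item_id].append(price)

def volatility(prices):
    # Changes = rows whose price differs from the previous row;
    # Volatility = Changes / (Records - 1)
    changes = sum(1 for prev, cur in zip(prices, prices[1:]) if cur != prev)
    return changes, changes / (len(prices) - 1)

for item_id, prices in prices_by_item.items():
    print(item_id, *volatility(prices))
# 2 -> 3 changes, 0.75; 6 -> 4 changes, 1.0; 8 -> 2 changes, 0.5
```

This matches the Changes and Volatility columns in the target table above, but only because an ordering within each day has been assumed.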
@TrivialSin Thought about this a little more and came up with the following; it seems to return the correct number of changes, 11. It makes the assumption that if, on the current day, any price is found that matches any price from yesterday, then the price did not change between days but only during the day. If that is not a correct assumption, let me know. There is a potential to undercount in the case where a price changes from one day to the next and then, on that same day, changes back to yesterday's price: that would technically be 2 changes, but this would only count it as 1. Otherwise, I think it is accurate; it returns 11 from your sample data, which is what I calculated by hand:
Measure 3 =
VAR __Table =
    SUMMARIZE(
        'Table14',
        [Date],
        "__DailyChanges", COUNTROWS(DISTINCT('Table14'[Price])) - 1 // Get number of changes in same day
    )
VAR __Table1 =
    ADDCOLUMNS(
        'Table14',
        "__ChangedSinceYesterday",
            VAR __CurrentPrices = SELECTCOLUMNS(FILTER(ALL('Table14'), [Date] = EARLIER([Date])), "__Price", [Price])
            VAR __YesterdayPrices = SELECTCOLUMNS(FILTER(ALL('Table14'), [Date] = EARLIER([Date]) - 1), "__Price", [Price])
            VAR __MatchTable = INTERSECT(__CurrentPrices, __YesterdayPrices)
            VAR __CountYesterday = COUNTROWS(__YesterdayPrices)
            VAR __Matches = COUNTROWS(__MatchTable)
            RETURN
                SWITCH(
                    TRUE(),
                    ISBLANK(__CountYesterday), 0, // no previous day, so no day-over-day change
                    ISBLANK(__Matches), 1,        // no price carried over from yesterday: count one change
                    0
                )
    )
RETURN
    SUMX(__Table, [__DailyChanges])
        + SUMX(SUMMARIZE(__Table1, [Date], [__ChangedSinceYesterday]), [__ChangedSinceYesterday])
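As a cross-check of the figure of 11, the measure's logic can be sketched in Python: per date, within-day changes are counted as distinct prices minus one, plus one day-over-day change whenever no price carries over from the previous day. Note that, like the measure above, this sketch does not split by Item ID, so prices from all items are pooled per date:

```python
# Sketch of the measure's counting logic (no Item ID split),
# using the sample data pooled across all items.
from collections import defaultdict
from datetime import date, timedelta

rows = [  # (date, price) pairs across all items
    (date(2021, 9, 14), 1079087.0), (date(2021, 9, 14), 1425434.0),
    (date(2021, 9, 15), 732740.3), (date(2021, 9, 16), 386393.3),
    (date(2021, 9, 17), 386393.3),
    (date(2021, 9, 14), 1194536.0), (date(2021, 9, 14), 1540883.0),
    (date(2021, 9, 15), 848189.3), (date(2021, 9, 16), 501842.3),
    (date(2021, 9, 17), 234734.0),
    (date(2021, 9, 13), 1656332.0), (date(2021, 9, 14), 1656332.0),
    (date(2021, 9, 15), 1656332.0), (date(2021, 9, 16), 617291.3),
    (date(2021, 9, 17), 231325.0),
]

def count_changes(rows):
    prices_by_date = defaultdict(set)
    for d, p in rows:
        prices_by_date[d].add(p)
    total = 0
    for d, todays in prices_by_date.items():
        total += len(todays) - 1  # within-day changes (distinct prices - 1)
        yesterdays = prices_by_date.get(d - timedelta(days=1))
        if yesterdays is not None and not (todays & yesterdays):
            total += 1  # nothing carried over: one day-over-day change
    return total

print(count_changes(rows))  # 11 on the sample data
```

Because the items are pooled, a price shared by two different items on adjacent days also suppresses a day-over-day change, which is part of why the per-item figures (3 + 4 + 2 = 9) differ from this total of 11.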
Hi Greg. I will be spending several hours going through your last response to see how that works. I think this solution only accounts for having one product though. The formatting in the table on my original post was poor. I have many products in the price table so there was actually an "Item ID" column between date and price.
I am using data from a game as a source of motivation to learn how I can utilise Power BI in my workplace environment. This is in combination with The Definitive Guide to DAX and the wealth of internet material (just added your book to my digital shelf on Kindle as well). I have made good progress, but I just can't get my head around how to achieve the result for these price changes. This is the first time I have had to reach out for help, and your time in responding is greatly appreciated. I won't just cut and paste an answer and then forget about it; I will dissect the solution to see how it was achieved.
My current reading makes me think RANKX needs to be involved somewhere. My original thought was that I would need to create a filtered table containing the records for the product (Item ID), which could be up to 750 rows spanning the last 18 months, and then compare the current row against the previous row for each row in the filtered table, counting the rows where the price is not equal.
To give some context, the volatility figure is to end up as a column in my current learning experiment below, where Item Name maps to an Item ID.
Your data does not allow the number of changes to be calculated, for the simple reason that one day can have multiple prices and you don't indicate the temporal order of those prices. If there are 3 prices in a day for a product, you can have the following cases (just showing some relevant ones):
1) A, A, B -> 1 change
2) A, B, A -> 2 changes
Even if you have only 2 prices in a day, it's still not enough to say what the number of changes is, because of cases like these:
1) DayBefore: A, CurrentDay: A, B -> 1 change
2) DayBefore: A, CurrentDay: B, A -> 2 changes
So, this problem is not well defined.
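The ambiguity can be made concrete with a tiny Python sketch: counting consecutive changes gives different answers for the two orderings, and the data itself does not say which ordering occurred:

```python
def consecutive_changes(prices):
    # Number of positions where the price differs from the one before it
    return sum(1 for prev, cur in zip(prices, prices[1:]) if cur != prev)

print(consecutive_changes(["A", "A", "B"]))  # 1 change
print(consecutive_changes(["A", "B", "A"]))  # 2 changes
```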
On top of that, your calculations seem to be incorrect for the table you show.
I have no aversion to critique, but could you also point me toward something positive to aid my learning?
@TrivialSin
Does your result table show the correct value under the Changes column? For Item 6, the change is 1; how did you calculate that?
It didn't show the right value. Amended. Late night. Thank you.
@TrivialSin I updated the measure to reflect the correct values when broken down by ID:
Measure 3 =
VAR __ItemID = MAX('Table14'[Item ID])
VAR __Table =
    SUMMARIZE(
        FILTER(ALL('Table14'), [Item ID] = __ItemID),
        [Date],
        "__DailyChanges", COUNTROWS(DISTINCT('Table14'[Price])) - 1 // Get number of changes in same day
    )
VAR __Table1 =
    ADDCOLUMNS(
        FILTER(ALL('Table14'), [Item ID] = __ItemID),
        "__ChangedSinceYesterday",
            VAR __CurrentPrices = SELECTCOLUMNS(FILTER(ALL('Table14'), [Item ID] = __ItemID && [Date] = EARLIER([Date])), "__Price", [Price])
            VAR __YesterdayPrices = SELECTCOLUMNS(FILTER(ALL('Table14'), [Item ID] = __ItemID && [Date] = EARLIER([Date]) - 1), "__Price", [Price])
            VAR __MatchTable = INTERSECT(__CurrentPrices, __YesterdayPrices)
            VAR __CountYesterday = COUNTROWS(__YesterdayPrices)
            VAR __Matches = COUNTROWS(__MatchTable)
            RETURN
                SWITCH(
                    TRUE(),
                    ISBLANK(__CountYesterday), 0, // no previous day, so no day-over-day change
                    ISBLANK(__Matches), 1,        // no price carried over from yesterday: count one change
                    0
                )
    )
RETURN
    SUMX(__Table, [__DailyChanges])
        + SUMX(SUMMARIZE(__Table1, [Date], [__ChangedSinceYesterday]), [__ChangedSinceYesterday])
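Broken down by Item ID, the same counting logic can be sketched in Python (a rough translation for study, not the DAX itself): for each item, sum (distinct prices per day minus one) plus one day-over-day change for each day whose prices share no value with the previous day's. On the sample data this reproduces the target Changes column:

```python
# Per-item sketch of the measure's counting logic on the sample data.
from collections import defaultdict
from datetime import date, timedelta

rows = [  # (date, item_id, price)
    (date(2021, 9, 14), 2, 1079087.0), (date(2021, 9, 14), 2, 1425434.0),
    (date(2021, 9, 15), 2, 732740.3), (date(2021, 9, 16), 2, 386393.3),
    (date(2021, 9, 17), 2, 386393.3),
    (date(2021, 9, 14), 6, 1194536.0), (date(2021, 9, 14), 6, 1540883.0),
    (date(2021, 9, 15), 6, 848189.3), (date(2021, 9, 16), 6, 501842.3),
    (date(2021, 9, 17), 6, 234734.0),
    (date(2021, 9, 13), 8, 1656332.0), (date(2021, 9, 14), 8, 1656332.0),
    (date(2021, 9, 15), 8, 1656332.0), (date(2021, 9, 16), 8, 617291.3),
    (date(2021, 9, 17), 8, 231325.0),
]

def changes_by_item(rows):
    prices = defaultdict(lambda: defaultdict(set))  # item -> date -> {prices}
    for d, item, p in rows:
        prices[item][d].add(p)
    result = {}
    for item, by_date in prices.items():
        total = 0
        for d, todays in by_date.items():
            total += len(todays) - 1  # within-day changes (distinct prices - 1)
            yesterdays = by_date.get(d - timedelta(days=1))
            if yesterdays is not None and not (todays & yesterdays):
                total += 1  # nothing carried over: one day-over-day change
        result[item] = total
    return result

print(changes_by_item(rows))  # {2: 3, 6: 4, 8: 2}
```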
Amazing. That gives exactly what I was looking for. I have removed the "ALL" from the various places, as I wanted to be able to see the volatility over the filtered period based on a date slicer on the page. As the "live" table has 6,000 item IDs and 4.6M records, I didn't fancy my chances on how many calculations a full run would take either. I will wait a few more weeks, until I have absorbed significantly more understanding, before tackling optimisation. Thank you so much for this. Hope your book is a good read; I will make sure I leave a review.
@TrivialSin Can you just do a COUNTROWS of the DISTINCT prices? Like:
Number of Price Changes = COUNTROWS(DISTINCT(SELECTCOLUMNS('Table',"__Price",[Price])))
Unfortunately not, as many items have a sine-wave price pattern, so the result from DISTINCT gives a much lower figure than expected.
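The undercount is easy to see with a hypothetical oscillating price series, sketched in Python: DISTINCT collapses repeated values, while the change count registers every transition:

```python
def consecutive_changes(prices):
    # Number of positions where the price differs from the one before it
    return sum(1 for prev, cur in zip(prices, prices[1:]) if cur != prev)

prices = [100, 200, 100, 200, 100]  # oscillating ("sine wave") prices
print(len(set(prices)))             # DISTINCT sees only 2 values
print(consecutive_changes(prices))  # but there were 4 actual changes
```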
@TrivialSin So what about a DISTINCT of the date and price columns together?
Number of Price Changes = COUNTROWS(DISTINCT(SELECTCOLUMNS('Table',"__Price",[Price],"__Date",[Date])))
I know you have solved my original puzzle, but just wondering: if I only ever had 1 price per ID per day, could I follow my original train of thought, create a filtered table with only the entries for the current Item ID, and then use your formula above to isolate the changes?
@TrivialSin If you only had 1 price per ID per day then that becomes more of a MTBF sort of thing where you just need to check the "previous" row. See my article on Mean Time Between Failure (MTBF): http://community.powerbi.com/t5/Community-Blog/Mean-Time-Between-Failure-MTBF-and-Power-BI/ba-p/3395....
The basic pattern is:
Column =
VAR __Current = [Value]
VAR __PreviousDate = MAXX(FILTER('Table', 'Table'[Date] < EARLIER('Table'[Date])), [Date])
VAR __Previous = MAXX(FILTER('Table', [Date] = __PreviousDate), [Value])
RETURN
    __Current - __Previous
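The previous-row pattern can be sketched in Python, assuming one price per item per day (using Item 8 from the sample data): for each row, find the latest earlier date and compare prices, mirroring the MAXX/EARLIER lookup above:

```python
# Previous-row comparison, assuming exactly one price per day.
from datetime import date

table = {  # date -> price for a single item (Item 8 from the sample data)
    date(2021, 9, 13): 1656332.0,
    date(2021, 9, 14): 1656332.0,
    date(2021, 9, 15): 1656332.0,
    date(2021, 9, 16): 617291.3,
    date(2021, 9, 17): 231325.0,
}

def count_changes(table):
    changes = 0
    for d, current in table.items():
        earlier = [e for e in table if e < d]  # like FILTER('Table', [Date] < EARLIER([Date]))
        if not earlier:
            continue  # first row has no previous value to compare against
        previous = table[max(earlier)]  # price on the latest earlier date
        if current != previous:
            changes += 1
    return changes

print(count_changes(table))  # 2 changes for Item 8
```

Summing an indicator like this per item would give the Changes column directly, and dividing by (records - 1) gives the volatility figure from the original post.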