I have a group of locations with values over time. I'd like to highlight the locations that are trending the worst, not necessarily the locations which have the worst actual numbers. For instance, going from 100 to 90 to 80 to 70 to 60 units in Location A is worse than going from 18 to 19 to 17 to 18 to 17 in Location B. Location B has lower numbers, but not a worse trend relative to Location A.
So my plan was to plot the trends over time and then find the locations with the largest negative slopes or largest negative linear regression values. In Excel, I'd do this by calculating the slope for each row, sorting and then removing all but the worst 5. Note that the worst 5 aren't necessarily the ones with the worst raw values.
In my model, the location names are in a location table, the months are in a date table and the values are a calculated measure from a different table. All have existing relationships.
Is there a method to accomplish this with DAX in the current setup? The examples I'm seeing (1, 2, 3) for calculating regression all seem to want everything in a single table, which is doable but seems inefficient.
Hey @Gondi63 ,
here you will find a pbix: https://tommartens-my.sharepoint.com/:u:/g/personal/tom_minceddata_com/ETrmBE_1L9FLipANeN9z33EBM6wr1...
I did not modify your data model, I only added two new measures:
You will find two charts; see the image below:
The two measures produce the same result but take different approaches.
Both measures determine the locations to be considered using SELECTCOLUMNS( TOPN( ... )).
TOPN returns the number of rows to be considered; this number is dynamic, driven by the number slicers. SELECTCOLUMNS removes all unnecessary columns, so a single-column table remains. I call this table the proxy table.
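As a sketch, the proxy table could be built like this. The names [Slope], [Top N Value], and 'Location'[Location] are assumptions based on the sample model, not necessarily the exact names in the pbix:

```
Proxy Table sketch =
SELECTCOLUMNS(
    TOPN(
        [Top N Value],                       // row count, driven by the number slicer (assumed measure)
        ADDCOLUMNS(
            ALL( 'Location'[Location] ),
            "@slope", [Slope]                // slope per location (hypothetical measure)
        ),
        [@slope], ASC                        // ASC so the most negative slopes come first
    ),
    "Location", 'Location'[Location]         // keep only the location column
)
```

Sorting ascending is what makes this pick the worst trends rather than the worst raw values.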
Two complex parts must be tackled: a generic one and a specific one.
The specific one is determining the slope per location; this has become simpler with the introduction of the new DAX functions LINESTX and LINEST. The specific part is the computation of a numeric expression. This is simple because the numeric expression is computed in the outer row context, and this outer row context can be modified using CALCULATE.
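A minimal slope measure along these lines might look as follows. 'Date'[MonthIndex] (a sequential month number) and [Total Value] are assumed names; LINESTX returns a one-row table that exposes the fitted slope in its Slope1 column:

```
Slope sketch =
VAR Fit =
    LINESTX(
        VALUES( 'Date'[MonthIndex] ),   // one row per month in the current context
        [Total Value],                  // y: the existing value measure (assumed name)
        'Date'[MonthIndex]              // x: sequential month number (assumed column)
    )
RETURN
    SELECTCOLUMNS( Fit, [Slope1] )      // extract the slope from the one-row result
```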
The outer row context must contain all locations, but the numeric expression must be computed separately for each location. Creating the table used for the iteration is the generic part, and it can be more complex than computing the numeric expression itself. The outer context is responsible for creating a proxy that is used to check whether a location must be considered. I use the following to create the iterator for the outer context:
CALCULATETABLE(
    VALUES( 'Location'[Location] )
    , ALL( 'Location' )
)
Could you check whether the measures still return the expected result when you use Market as an additional filter?
I use ADDCOLUMNS( outer context, "slope", <numeric expression> ) to add the numeric expression to each row, i.e., to each location.
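Put together, the iterator could look like this sketch ([Slope] is the hypothetical measure from above). Referencing a measure inside ADDCOLUMNS wraps it in an implicit CALCULATE, so context transition evaluates the slope separately for each location:

```
Worst Slopes sketch =
ADDCOLUMNS(
    CALCULATETABLE(
        VALUES( 'Location'[Location] )
        , ALL( 'Location' )
    ),
    "slope", [Slope]    // implicit CALCULATE: row context becomes filter context per location
)
```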
The integrated measure contains a final step that uses the "original" numeric expression only when the current location is part of the proxy table. This approach requires integrating the computation of the proxy table into each existing measure, or creating explicit measures. Still, it performs better (this depends, of course, on the number of rows) than the check-measure approach combined with a visual-level filter:
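The final step of such an integrated measure can be sketched like this, assuming [Total Value] is the original measure and reusing the assumed names from the earlier sketches. Locations outside the proxy table return blank and therefore disappear from the visual:

```
Value (worst N only) sketch =
VAR Proxy =
    SELECTCOLUMNS(
        TOPN(
            [Top N Value],
            ADDCOLUMNS( ALL( 'Location'[Location] ), "@slope", [Slope] ),
            [@slope], ASC
        ),
        "Location", 'Location'[Location]
    )
RETURN
    IF(
        SELECTEDVALUE( 'Location'[Location] ) IN Proxy,
        [Total Value]    // blank for all other locations, hiding them in the visual
    )
```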
Hopefully, this provides what you are looking for.
Regards,
Tom
Hey @Gondi63
yes, it's doable.
Create a pbix file that contains sample data but still reflects your data model (tables, relationships, calculated columns, and measures). Upload the pbix to OneDrive, Google Drive, or Dropbox and share the link. If you are using Excel to create the sample data instead of entering it manually, share the xlsx as well.
Regards,
Tom
Thanks @TomMartens --- Here is the PBIX and dummy data file. If I were to want the 3 highest slopes of this data, I should get series A, B & C.
Hey,
not sure if I will find time these days, but I will have a closer look next weekend.
Regards,
Tom
Hi @TomMartens -- I don't want to bug you, so this will be my last bump, but I'm still hoping I can learn something here! Thanks!