Hello to all!
My problem today is to calculate an average measure over 4 time slots for a six-month period of collected data.
The time slots are: 0-6, 7-12, 13-18, and 19-24 (hours).
The goal is to filter my data on a six-month timestamp window and compare the averages between these time slots.
Has anyone experienced this kind of problem? Thanks in advance for any suggestion.
Ago
Hi @AgoZ_KH,
You can extract the hour from the datetime field and create a calculated column based on nested IF conditions, then create a table visual with the new category column and the aggregated (average) raw table value fields to achieve your requirement:
Time Range =
IF (
    HOUR ( [operation_timestamp] ) >= 19,
    4,
    IF (
        HOUR ( [operation_timestamp] ) >= 13,
        3,
        IF ( HOUR ( [operation_timestamp] ) >= 7, 2, 1 )
    )
)
Regards,
Xiaoxin Sheng
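For anyone who wants to sanity-check the slot boundaries outside Power BI, here is a small Python stand-in (an illustration only, not DAX) for the same hour-to-slot mapping as the nested IF above:

```python
from datetime import datetime

def time_range(ts: datetime) -> int:
    """Mirror the nested DAX IF: slot 4 for hour >= 19, slot 3 for >= 13,
    slot 2 for >= 7, otherwise slot 1."""
    h = ts.hour
    if h >= 19:
        return 4
    if h >= 13:
        return 3
    if h >= 7:
        return 2
    return 1

print(time_range(datetime(2020, 2, 27, 8, 0)))   # 2
print(time_range(datetime(2020, 2, 29, 21, 3)))  # 4
```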
Hello to all!! Thanks to @v-shex-msft, @lbendlin & @Greg_Deckler for your suggestions!
I used some of the suggestions to resolve my issue this way:
First, I extracted the time for each row into a new calculated column: Time = FORMAT(myDB[operation_timestamp],"hh:mm:ss")
Then I calculated the hour for each row in a second calculated column, DayTime = HOUR(myDB[Time]), and stored the result in text format.
Third, a new calculated column using the DAX SWITCH function:
DayQuarter =
SWITCH (
    [DayTime],
    "0", "12am - 6am",
    "1", "12am - 6am",
    "2", "12am - 6am",
    "3", "12am - 6am",
    "4", "12am - 6am",
    "5", "12am - 6am",
    "6", "6 am - 12pm",
    "7", "6 am - 12pm",
    "8", "6 am - 12pm",
    "9", "6 am - 12pm",
    "10", "6 am - 12pm",
    "11", "6 am - 12pm",
    "12", "12pm - 6 pm",
    "13", "12pm - 6 pm",
    "14", "12pm - 6 pm",
    "15", "12pm - 6 pm",
    "16", "12pm - 6 pm",
    "17", "12pm - 6 pm",
    "6pm - 12 am"
)
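As a cross-check of the SWITCH mapping, here is a Python stand-in (illustrative only, not part of the report) that assigns the same day-quarter labels from a numeric hour:

```python
def day_quarter(hour: int) -> str:
    """Same label buckets as the DAX SWITCH above:
    0-5, 6-11, 12-17, and everything else as evening."""
    if hour <= 5:
        return "12am - 6am"
    if hour <= 11:
        return "6 am - 12pm"
    if hour <= 17:
        return "12pm - 6 pm"
    return "6pm - 12 am"

print(day_quarter(8))   # 6 am - 12pm
print(day_quarter(21))  # 6pm - 12 am
```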
Obviously I don't think it's the most elegant or simplest way to solve my problem, but at the moment it works, even if all the suggested approaches should equally achieve the intended goal.
I obviously reserve the right to go back to the solution to see if I can find an even easier way ...
Ago
@AgoZ_KH Can you share sample data and expected output? Would make it more clear in my head.
Suppose my table is as shown:
Field1 | Field2 | Field3 | operation_timestamp |
---|---|---|---|
759,3254 | 103,6743 | 3,45 | 27/02/2020 08:00 |
756,9019 | 104,9481 | 4,69 | 27/02/2020 08:00 |
324,9738 | 102,6734 | 21,98 | 27/02/2020 08:01 |
664,9268 | 105,0859 | 14,28 | 27/02/2020 08:01 |
608,652 | 103,8442 | 11,52 | 27/02/2020 08:02 |
710,9264 | 103,9425 | 5,56 | 27/02/2020 08:02 |
236,5321 | 104,1536 | 19,28 | 27/02/2020 08:02 |
272,6447 | 102,0826 | 24,69 | 27/02/2020 08:03 |
305,73 | 103,3464 | 22,4 | 27/02/2020 08:04 |
399,3463 | 98,92466 | 30,13 | 27/02/2020 08:04 |
301,5051 | 103,6406 | 20,8 | 27/02/2020 08:05 |
233,4147 | 62,07999 | 192,59 | 27/02/2020 08:05 |
750,2224 | 104,1767 | 5,16 | 28/02/2020 14:00 |
790,4665 | 103,965 | 5,31 | 28/02/2020 14:00 |
712,0485 | 105,5373 | 13,93 | 28/02/2020 14:01 |
370,0733 | 103,5125 | 24,33 | 28/02/2020 14:01 |
388,4965 | 104,0848 | 10,71 | 28/02/2020 14:02 |
259,4267 | 103,7636 | 17,59 | 28/02/2020 14:02 |
722,3729 | 104,4462 | 5,07 | 28/02/2020 14:03 |
269,6838 | 100,0554 | 26,86 | 28/02/2020 14:03 |
340,3696 | 103,8541 | 24,7 | 28/02/2020 14:04 |
844,4793 | 104,3328 | 4,65 | 29/02/2020 21:00 |
772,1326 | 104,3125 | 5,21 | 29/02/2020 21:00 |
677,9255 | 104,615 | 13,12 | 29/02/2020 21:01 |
323,4487 | 103,2005 | 24,05 | 29/02/2020 21:01 |
353,381 | 103,8914 | 178 | 29/02/2020 21:02 |
218,8718 | 104,1466 | 15,05 | 29/02/2020 21:02 |
706,2962 | 104,2135 | 5,43 | 29/02/2020 21:03 |
250,2747 | 100,4057 | 25,99 | 29/02/2020 21:03 |
320,1574 | 103,6253 | 22,68 | 29/02/2020 21:04 |
I need to classify each row with a Time_slot number, i.e. 1 for hours between 0-6, 2 between 7-12, 3 between 13-18, and 4 between 19-24, so the report can show the average for each slot (and field):
Timeslot | Average F1 |
---|---|
2 | 464,5733 |
3 | 511,4622 |
4 | 496,3297 |
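The slot-2 figure in this expected output can be reproduced directly from the sample rows above; a quick Python check of the arithmetic (decimal commas written as dots):

```python
# Field1 values of the twelve slot-2 rows in the sample (27/02/2020, hours 7-12)
slot2_field1 = [
    759.3254, 756.9019, 324.9738, 664.9268, 608.652, 710.9264,
    236.5321, 272.6447, 305.73, 399.3463, 301.5051, 233.4147,
]
avg_f1 = sum(slot2_field1) / len(slot2_field1)
print(round(avg_f1, 4))  # 464.5733, matching the "Average F1" row for slot 2
```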
As the user @lbendlin suggested, I probably need a calculated column giving each row the Timeslot number based on the operation_timestamp column.
Hope this is more clear ...
Ago
Here's how your table "Buckets" would look after the transform in Power Query:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("dZRNbiwxCISv8jRrpPCP/a4S5f7XCI1xL9Lu1Vgjfw1FFf7+/oRNEDb9wIdQwEMljwJq+cPxhfzFyPgPx3/Ezw9chMNEmkUoTB2URwWfb4RwXgsZRXDWkKscE8zxRKgQ90TYF2KAw6qeAr8jOMCNW8hQrSNB/fWH4KWE8CqytU/lS7SB+RvB4mDC1NrJ5LpK89hWI5GKVaPF4+D6+nlesuaVikO6K9FqkBn0eV/X/TmvaxcxRypSv0oIAskrgjkYNNq2K1ZXCAcd1tIzFFQ6PGXEnOXIZLCDEOuoIDCz7nGFRw2Yqti4CdI7XBMh27dtiS9HhF4BymZ0WAfFZA0uUXkiKygSmP3LHrBR2Z6OyDsyBujstjRN1JVMiENj7XsulrJHVwlfUYke1wmJNFmC92apOpd6jCexksI+wUdvVqqyWmR2GIcJd7gUQXz63hNTWuoPRVZWcpVAY0q3JVJhzwDX2OZNMN2mZOZJ2DdBvVi1OEfCIzK4tgfsZMtF4ifRlnDmUceeLyO2i3hoqxHLdRo782NSBTPG835bSANG0NjxXXtFdizRFqIDT+dGmGQpr1f1D9EWXjsS/T5k+NHKCIParyMiuadkcb/ZbLJeCD9ISQ9/fgE=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Field1 = _t, Field2 = _t, Field3 = _t, operation_timestamp = _t]),
#"Split Column by Delimiter" = Table.SplitColumn(Source, "operation_timestamp", Splitter.SplitTextByDelimiter("/", QuoteStyle.Csv), {"operation_timestamp.1", "operation_timestamp.2", "operation_timestamp.3"}),
#"Added Custom" = Table.AddColumn(#"Split Column by Delimiter", "Custom", each [operation_timestamp.2] & "/" & [operation_timestamp.1] & "/" & [operation_timestamp.3]),
#"Changed Type" = Table.TransformColumnTypes(#"Added Custom",{{"Custom", type datetime}}),
#"Removed Other Columns" = Table.SelectColumns(#"Changed Type",{"Field1", "Field2", "Field3", "Custom"}),
#"Replaced Value" = Table.ReplaceValue(#"Removed Other Columns",",","",Replacer.ReplaceText,{"Field1", "Field2", "Field3"}),
#"Changed Type1" = Table.TransformColumnTypes(#"Replaced Value",{{"Field1", Int64.Type}, {"Field2", Int64.Type}, {"Field3", Int64.Type}}),
#"Added Custom1" = Table.AddColumn(#"Changed Type1", "Bucket", each Number.RoundDown(4* Number.From(Time.From([Custom]))))
in
#"Added Custom1"
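The last step of the query relies on Time.From returning the time of day as a fraction of a day (0.0 to 1.0), so Number.RoundDown of 4 times that fraction yields buckets 0-3. A Python sketch of that arithmetic (illustrative only, not M code):

```python
import math
from datetime import datetime

def bucket(ts: datetime) -> int:
    """Floor of 4 * (time of day as a fraction of 24 h), like the M step."""
    fraction = (ts.hour * 3600 + ts.minute * 60 + ts.second) / 86400
    return math.floor(4 * fraction)

print(bucket(datetime(2020, 2, 27, 8, 0)))   # 1  (06:00-11:59)
print(bucket(datetime(2020, 2, 28, 14, 1)))  # 2  (12:00-17:59)
print(bucket(datetime(2020, 2, 29, 21, 3)))  # 3  (18:00-23:59)
```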
And then the calculated column would be
Assuming you have a datetime field for your data, you want to do two things:
1. Create a calculated column that uses only the date value of your field, then connect that date field to your calendar table.
2. Create another calculated column that assigns the quarter-day buckets. For example:
DayQuarter = TRUNC ( MOD ( [datetimefield] * 4, 4 ) )
This will give you buckets of 0, 1, 2, and 3 (TRUNC is needed because MOD of the serial number returns a fractional value), which you can then map to your time ranges:
DayQuarter = SWITCH ( TRUNC ( MOD ( [datetimefield] * 4, 4 ) ), 0, "12am - 6am", 1, "6 am - 12pm", 2, "12pm - 6 pm", "6pm - 12 am" )
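A quick check of that MOD arithmetic in Python (a sketch; the serial value below is an arbitrary example): a datetime serial is whole days plus a fraction of a day, so multiplying by 4 and taking mod 4 leaves 4 times the fraction, and truncating gives the quarter-day bucket.

```python
import math

def quarter_bucket(serial: float) -> int:
    """TRUNC(MOD(serial * 4, 4)) for a positive datetime serial number."""
    return math.trunc(math.fmod(serial * 4, 4))

# 43888.0 stands in for some whole day; add hours as fractions of 24.
print(quarter_bucket(43888.0 + 8 / 24))   # 08:00 -> bucket 1
print(quarter_bucket(43888.0 + 14 / 24))  # 14:00 -> bucket 2
print(quarter_bucket(43888.0 + 21 / 24))  # 21:00 -> bucket 3
```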