I am relatively new to Power BI, more used to Excel, and am having a difficult time showing overlapping shifts. The formula and a screenshot from Excel are below.
=IF(F3=F2, IF(H3=I2, "follow", IF(H3<I2, "OVERLAP", "After")), "---")
Any help would be greatly appreciated.
Hi @raicardi ,
I agree with @MahyarTF .
Here is my solution:
You can create an index column in Power Query:
Here is the M code:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMtQ31DcyMDJS0lEytLC01DU00jU2UDAwtzIwACJUUUNTqGisDi6NhsZWxpgajQyhohCNRuTaiE0jUTYak2sjNo2EbIwFAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Work Date" = _t, #"In Time" = _t, #"Out Time" = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Work Date", type date}, {"In Time", type datetime}, {"Out Time", type datetime}}),
#"Added Index" = Table.AddIndexColumn(#"Changed Type", "Index", 1, 1, Int64.Type)
in
#"Added Index"
Then create and apply the measure:
Measure =
VAR _a = MAX ( 'Table'[Index] )
VAR _b = FILTER ( ALL ( 'Table' ), 'Table'[Index] = _a - 1 )
RETURN
    IF (
        MAXX ( _b, 'Table'[Work Date] ) = MAX ( 'Table'[Work Date] ),
        SWITCH (
            TRUE (),
            MAXX ( _b, 'Table'[Out Time] ) = MAX ( 'Table'[In Time] ), "follow",
            MAXX ( _b, 'Table'[Out Time] ) > MAX ( 'Table'[In Time] ), "OVERLAP",
            "After"
        ),
        "---"
    )
Output:
Kind Regards,
Bubble
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Hi @raicardi ,
Could you tell me whether your problem has been solved? If it has, kindly accept the reply above as the solution so more people can benefit from it. If you are still confused, please provide more details about your table and your problem, or share your pbix file with me after removing sensitive data.
Refer to:
How to provide sample data in the Power BI Forum
How to Get Your Question Answered Quickly
Best Regards,
Jianbo Li
Hi,
I created the column as an Index Column:
Thanks for the information. I realized I needed to group 20 distinct locations, so the formula above was helpful, but I am still unable to achieve the result I'm looking for. Columns and screenshots below. [DAX Index] below is the
Not sure why it is not working.
Would you please share your sample PBIX file?
I checked my work again and it is working.
Let me know if this works?
Work Date | In Time | Out Time | Overlap | Order
1/1/2022 | 1899-12-30 07:00:00 | 1899-12-30 15:00:00 | --- | 1
1/1/2022 | 1899-12-30 13:30:00 | 1899-12-30 21:30:00 | --- | 2
1/2/2022 | 1899-12-30 07:00:00 | 1899-12-30 15:00:00 | --- | 3
1/2/2022 | 1899-12-30 13:30:00 | 1899-12-30 21:30:00 | --- | 4
1/3/2022 | 1899-12-30 07:00:00 | 1899-12-30 15:00:00 | --- | 5
1/3/2022 | 1899-12-30 13:30:00 | 1899-12-30 21:30:00 | --- | 6
1/4/2022 | 1899-12-30 07:00:00 | 1899-12-30 21:30:00 | --- | 7
1/5/2022 | 1899-12-30 07:00:00 | 1899-12-30 21:30:00 | --- | 8
1/6/2022 | 1899-12-30 07:00:00 | 1899-12-30 21:30:00 | --- | 9
1/7/2022 | 1899-12-30 07:00:00 | 1899-12-30 15:00:00 | --- | 10
1/7/2022 | 1899-12-30 13:30:00 | 1899-12-30 21:30:00 | --- | 11
1/8/2022 | 1899-12-30 07:00:00 | 1899-12-30 21:30:00 | --- | 12
1/9/2022 | 1899-12-30 07:00:00 | 1899-12-30 15:00:00 | OVERLAP | 13
1/9/2022 | 1899-12-30 13:30:00 | 1899-12-30 21:30:00 | OVERLAP | 14
######## | 1899-12-30 07:00:00 | 1899-12-30 15:00:00 | --- | 15
######## | 1899-12-30 13:30:00 | 1899-12-30 21:30:00 | --- | 16
######## | 1899-12-30 07:00:00 | 1899-12-30 21:30:00 | --- | 17
######## | 1899-12-30 07:00:00 | 1899-12-30 15:00:00 | --- | 18
######## | 1899-12-30 13:30:00 | 1899-12-30 21:30:00 | --- | 19
######## | 1899-12-30 07:00:00 | 1899-12-30 15:00:00 | --- | 20
######## | 1899-12-30 13:30:00 | 1899-12-30 21:30:00 | --- | 21
######## | 1899-12-30 07:00:00 | 1899-12-30 21:30:00 | --- | 22
######## | 1899-12-30 07:00:00 | 1899-12-30 15:00:00 | --- | 23
######## | 1899-12-30 13:30:00 | 1899-12-30 21:30:00 | --- | 24
######## | 1899-12-30 07:00:00 | 1899-12-30 21:30:00 | --- | 25
######## | 1899-12-30 07:00:00 | 1899-12-30 15:00:00 | --- | 26
######## | 1899-12-30 13:30:00 | 1899-12-30 21:30:00 | --- | 27
######## | 1899-12-30 07:00:00 | 1899-12-30 21:30:00 | --- | 28
######## | 1899-12-30 07:00:00 | 1899-12-30 15:00:00 | --- | 29
######## | 1899-12-30 13:30:00 | 1899-12-30 21:30:00 | --- | 30
######## | 1899-12-30 07:00:00 | 1899-12-30 21:30:00 | --- | 31
######## | 1899-12-30 07:00:00 | 1899-12-30 15:00:00 | --- | 32
######## | 1899-12-30 13:30:00 | 1899-12-30 21:30:00 | --- | 33
######## | 1899-12-30 07:00:00 | 1899-12-30 21:30:00 | --- | 34
######## | 1899-12-30 07:00:00 | 1899-12-30 21:30:00 | --- | 35
######## | 1899-12-30 07:00:00 | 1899-12-30 15:00:00 | --- | 36
######## | 1899-12-30 13:30:00 | 1899-12-30 21:30:00 | --- | 37
######## | 1899-12-30 07:00:00 | 1899-12-30 21:30:00 | --- | 38
######## | 1899-12-30 07:00:00 | 1899-12-30 21:30:00 | --- | 39
######## | 1899-12-30 07:00:00 | 1899-12-30 21:30:00 | --- | 40
######## | 1899-12-30 07:00:00 | 1899-12-30 15:00:00 | --- | 41
######## | 1899-12-30 13:30:00 | 1899-12-30 21:30:00 | --- | 42
######## | 1899-12-30 07:00:00 | 1899-12-30 15:00:00 | --- | 43
######## | 1899-12-30 13:30:00 | 1899-12-30 21:30:00 | --- | 44
######## | 1899-12-30 07:00:00 | 1899-12-30 21:30:00 | --- | 45
######## | 1899-12-30 15:00:00 | 1899-12-30 21:30:00 | --- | 46
######## | 1899-12-30 07:00:00 | 1899-12-30 15:00:00 | --- | 47
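Since the data needs to be grouped by 20 distinct locations, each with its own index, one option is to build the index per group directly in Power Query. This is only a sketch: it assumes the previous step is named #"Changed Type" (as in the M code earlier in the thread) and that the table has a "Location" column, which is a hypothetical name standing in for whatever the real grouping column is called:

```
let
    // Group by the (hypothetical) Location column and restart the index
    // at 1 inside each group's nested table
    Grouped = Table.Group(
        #"Changed Type",
        {"Location"},
        {{"Rows", each Table.AddIndexColumn(_, "Index", 1, 1, Int64.Type), type table}}
    ),
    // Expand the nested tables back into a flat table, keeping the new Index
    Expanded = Table.ExpandTableColumn(
        Grouped,
        "Rows",
        {"Work Date", "In Time", "Out Time", "Index"}
    )
in
    Expanded
```

With a per-group index like this, the previous-row measure above only compares shifts within the same location, because Index = _a - 1 never crosses a group boundary in a meaningful way when each group restarts at 1 and is filtered to one location.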
Hi,
This is my result:
Thanks. I'm still not sure what the issue is. I've included the entire data set with all 20 groups, each with its own corresponding index. Hoping this is my issue.
I hope so.
Please kudos my posts if they helped you.
Do you think it's related to multiple indexes? If you could take a look at the data link below, that would be amazing.
https://docs.google.com/spreadsheets/d/162cWchQU-t_Gc7o4wgUH4WXyzCa1Dhq4J3BLDjHbMZg/edit?usp=sharing
I think so. Please use my DAX for creating the Index column.
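The DAX for that index column isn't shown in this thread, but a common pattern for a per-group index is RANKX within each group as a calculated column. This is a sketch under the same assumption as before, that a "Location" column (hypothetical name) identifies each of the 20 groups:

```
Index =
RANKX (
    -- restrict ranking to rows in the same (hypothetical) location group
    FILTER ( 'Table', 'Table'[Location] = EARLIER ( 'Table'[Location] ) ),
    -- rank by date plus time-of-day so shifts sort chronologically
    'Table'[Work Date] + 'Table'[In Time],
    ,
    ASC,
    Dense
)
```

Note that ASC must be specified explicitly, since RANKX ranks in descending order by default; without it the earliest shift would get the highest index.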