micpete
Helper III

Consecutive days - reset to 1 if not consecutive

Hello,

 

I need a formula to work out the number of consecutive days a person has come into work. I've been trying to find this for ages and can't figure it out. I have a list of individuals, each with the dates they have come into work, and if a person has come to work more than 12 days within a 14-day period I need to flag that period red. The only issue is that a person can have the same date listed twice because they signed into two different offices within the same site on that day. Any day they haven't signed in, the date does not show at all.

 

Please see the table below. The ID references a person, so rows with the same number belong to the same person. You can see the dates skip a few days, and this is where my issue is at the moment. The formula I used is below the table:

 

micpete_0-1724058500815.png

Consecutive Days 2 =
VAR CurrentOperative = 'bi came_on_site_facts'[operative_id]
VAR CurrentDate = 'bi came_on_site_facts'[Date]
VAR Last_1_date =
    MAXX(
        FILTER(
            'bi came_on_site_facts',
            'bi came_on_site_facts'[operative_id] = CurrentOperative
                && 'bi came_on_site_facts'[Date] <= CurrentDate
                && NOT 'bi came_on_site_facts'[First Day]
        ),
        'bi came_on_site_facts'[Date]
    )
RETURN
    IF(
        'bi came_on_site_facts'[First Day],
        COUNTROWS(
            FILTER(
                'bi came_on_site_facts',
                'bi came_on_site_facts'[operative_id] = CurrentOperative
                    && 'bi came_on_site_facts'[Date] <= CurrentDate
                    && 'bi came_on_site_facts'[Date] > Last_1_date
                    && 'bi came_on_site_facts'[First Day]
            )
        )
    )
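
For reference, here is a rough sketch of the kind of 14-day flag I'm describing, as a calculated column (untested; the column name is made up, the table and column names match the formula above, and the more-than-12-days threshold is the one from my description). It counts distinct dates, so two sign-ins on the same day only count as one day:

Flag 14 Day Period =
// Sketch only: distinct attendance dates for this operative in the
// trailing 14-day window ending on the current row's date.
VAR CurrentOperative = 'bi came_on_site_facts'[operative_id]
VAR CurrentDate = 'bi came_on_site_facts'[Date]
VAR DaysInWindow =
    CALCULATE(
        DISTINCTCOUNT('bi came_on_site_facts'[Date]),
        FILTER(
            ALL('bi came_on_site_facts'),
            'bi came_on_site_facts'[operative_id] = CurrentOperative
                && 'bi came_on_site_facts'[Date] > CurrentDate - 14
                && 'bi came_on_site_facts'[Date] <= CurrentDate
        )
    )
RETURN
    // Flag the period red when more than 12 distinct days fall in the window.
    IF(DaysInWindow > 12, "Red", "OK")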
38 REPLIES
micpete
Helper III

Hi @lbendlin, are you able to help me? I've been stuck on this for too long. I need the dates to be filtered by the operative ID, but I require the count of rows to be by unique date.

 

 

let
  Source = Table.FromRows(
    Json.Document(
      Binary.Decompress(
        Binary.FromText(
          "tdI7CsAgEEXRvVgrOON/C9mCuIV02X8gjU+FQMjYDZzqMbdWRZbIWa+0Im+O6zT83KrpwUI3ni11C1KWu2UpK7BvGbEH2QLOI38gAaZPyPBIK4fQDjs5xOyiHEbAOZN3hGbd8uxNCE27pYRNWEZsNw==", 
          BinaryEncoding.Base64
        ), 
        Compression.Deflate
      )
    ), 
    let
      _t = ((type nullable text) meta [Serialized.Text = true])
    in
      type table [operative_id = _t, Date = _t, #"Consec test" = _t]
  ), 
  #"Changed Type" = Table.TransformColumnTypes(Source, {{"Date", type date}}), 
  #"Added Index" = Table.AddIndexColumn(#"Changed Type", "Index", 0, 1, Int64.Type), 
  #"Added Custom" = Table.AddColumn(
    #"Added Index", 
    "Consec", 
    each List.Accumulate(
      {0 .. [Index]}, 
      1, 
      (state, current) =>
        if current = 0 or #"Added Index"[Date]{current} = #"Added Index"[Date]{current - 1} then
          state
        else if #"Added Index"[Date]{current}
          = #"Added Index"[Date]{current - 1} + #duration(1, 0, 0, 0)
        then
          state + 1
        else
          1
    ), 
    Int64.Type
  )
in
  #"Added Custom"

 

 

How to use this code: Create a new Blank Query. Click on "Advanced Editor". Replace the code in the window with the code provided here. Click "Done". Once you have examined the code, replace the Source step with your own source.

Hi @lbendlin, does this index within each operative ID? The operative ID is unique to the individual that has been on site, so the index needs to reflect this. For example, op ID 1 could have been on site on 23/08/24 and 24/08/24, and op ID 2 could also have been on site on 23/08/24 and 24/08/24, so the index would need to be 1, 2 for op ID 1 and also 1, 2 for op ID 2.

Please provide sample data that fully covers your issue.
Please show the expected outcome based on the sample data you provided.

Hi @lbendlin, I provided the sample data in the other post along with the DAX I used to filter by operative ID. Does this suffice? I need the date to index by operative ID:

 

Consec test =
VAR CurrentOperative = 'bi came_on_site_facts'[operative_id]
VAR CurrentDate = 'bi came_on_site_facts'[Date]
VAR DateRange =
    FILTER(
        'bi came_on_site_facts',
        'bi came_on_site_facts'[operative_id] = CurrentOperative
            && 'bi came_on_site_facts'[Date] >= CurrentDate - 13
            && 'bi came_on_site_facts'[Date] <= CurrentDate
    )
VAR ConsecutiveDays =
    COUNTROWS(
        FILTER(
            DateRange,
            DATEDIFF(
                MINX(DateRange, 'bi came_on_site_facts'[Date]),
                'bi came_on_site_facts'[Date],
                DAY
            ) <= 13
        )
    )
RETURN
    ConsecutiveDays
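
For comparison, a minimal variation of the column above that counts unique dates inside the same 14-day window instead of raw rows, so duplicate same-day sign-ins aren't counted twice (a sketch only, not tested against the real model; table and column names as above):

Consec test (unique dates) =
VAR CurrentOperative = 'bi came_on_site_facts'[operative_id]
VAR CurrentDate = 'bi came_on_site_facts'[Date]
VAR DateRange =
    FILTER(
        'bi came_on_site_facts',
        'bi came_on_site_facts'[operative_id] = CurrentOperative
            && 'bi came_on_site_facts'[Date] >= CurrentDate - 13
            && 'bi came_on_site_facts'[Date] <= CurrentDate
    )
RETURN
    // One row per distinct date in the window, then count them.
    COUNTROWS(SUMMARIZE(DateRange, 'bi came_on_site_facts'[Date]))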

 

operative_id   Date        Consec test
1011304        14-Jun-24   1
1011304        15-Jun-24   2
1011304        17-Jun-24   5
1011304        17-Jun-24   5
1011304        17-Jun-24   5
1011304        18-Jun-24   8
1011304        18-Jun-24   8
1011304        18-Jun-24   8
1011304        19-Jun-24   12
1011304        19-Jun-24   12
1011304        19-Jun-24   12
1011304        19-Jun-24   12
1011304        20-Jun-24   15
1011304        20-Jun-24   15
1011304        20-Jun-24   15
1011304        21-Jun-24   17
1011304        21-Jun-24   17
1011304        22-Jun-24   20
1011304        22-Jun-24   20
1011304        22-Jun-24   20
1011304        24-Jun-24   23
1011304        24-Jun-24   23
1011304        24-Jun-24   23
1011304        25-Jun-24   26
1011304        25-Jun-24   26
1011304        25-Jun-24   26
1011304        26-Jun-24   28
1011304        26-Jun-24   28
1011304        27-Jun-24   32
1011304        27-Jun-24   32
1011304        27-Jun-24   32
1011304        27-Jun-24   32
1011304        28-Jun-24   35
1011304        28-Jun-24   35
1011304        28-Jun-24   35
1011304        28-Jun-24   35
1011304        29-Jun-24   35

Hi @lbendlin 

I've just added another OP ID - we have 1000s of operatives so I just made up an OP ID

 

operative_id   Date        Consec test
1011304        14-Jun-24   1
1011304        15-Jun-24   2
1011304        17-Jun-24   5
1011304        17-Jun-24   5
1011304        17-Jun-24   5
1011304        18-Jun-24   8
1011304        18-Jun-24   8
1011304        18-Jun-24   8
1011304        19-Jun-24   12
1011304        19-Jun-24   12
1011304        19-Jun-24   12
1011304        19-Jun-24   12
1011304        20-Jun-24   15
1011304        20-Jun-24   15
1011304        20-Jun-24   15
1011304        21-Jun-24   17
1011304        21-Jun-24   17
1011304        22-Jun-24   20
1011304        22-Jun-24   20
1011304        22-Jun-24   20
1011304        24-Jun-24   23
1011304        24-Jun-24   23
1011304        24-Jun-24   23
1011304        25-Jun-24   26
1011304        25-Jun-24   26
1011304        25-Jun-24   26
1011304        26-Jun-24   28
1011304        26-Jun-24   28
1011304        27-Jun-24   32
1011304        27-Jun-24   32
1011304        27-Jun-24   32
1011304        27-Jun-24   32
1011304        28-Jun-24   35
1011304        28-Jun-24   35
1011304        28-Jun-24   35
1011304        28-Jun-24   35
1011304        29-Jun-24   35
1011306        29-Jun-24   1
1011309        29-Jun-24   1

Hi @lbendlin,

 

We also have 100,000s of rows of this data, and the query you sent is taking a very long time - currently hours to update. Is there a way to make this more efficient?

I'll say it one more time:  Please provide sample data that fully covers your issue.

Hi @lbendlin,

 

That is a sample of the data that fully covers my issue. As mentioned before, I need a formula to work out the number of consecutive days a person has come into work. I have a list of individuals with the dates they have come into work, and if a person has come to work more than 12 days within a 14-day period I need to flag that period red. The only issue is that a person could have a date displayed twice because they signed into different offices within the same site on that day. Any day they haven't signed in, the date does not show.

 

I have provided data with a different operative ID, data with the same date, and the result I got using the DAX I posted previously.

 

As also mentioned, there are 1000s of operatives on site, each with a unique ID. I cannot put all the IDs here because there are too many, so the data I have provided should suffice.

 

The query you sent me is taking too long to load the 100,000s of rows, as the dates go back to 2020 and there are 1000s of operative IDs. The sample data I have already provided should suffice, along with the explanation I have now given 4 or 5 times. Please let me know what else is required.

At a minimum I would like to see a couple of IDs, each with a combination of contiguous and non-contiguous dates.  I won't be able to help without meaningful sample data.

Hi @lbendlin 

 

Please see below. The Consec test column is the result I get using the DAX I've shown previously. It's wrong, though: it filters the index by operative ID, but because some operatives might have entered the site twice in one day, I don't want the index to count those days again. I also need to know when an operative has done more than 12 days in a 14-day period, which I haven't been able to figure out. (See the Power Query sketch after the table below.)

 

operative_id   Date        Consec test
6771           25-Mar-21   1
6771           01-Apr-21   2
6771           07-Apr-21   3
1011304        14-Jun-24   1
1011304        15-Jun-24   2
1011304        17-Jun-24   5
1011304        17-Jun-24   5
1011304        17-Jun-24   5
1011304        18-Jun-24   8
1011304        18-Jun-24   8
1011304        18-Jun-24   8
1011304        19-Jun-24   12
1011304        19-Jun-24   12
1011304        19-Jun-24   12
1011304        19-Jun-24   12
1011304        20-Jun-24   15
1011304        20-Jun-24   15
1011304        20-Jun-24   15
1011304        21-Jun-24   17
1011304        21-Jun-24   17
1011304        22-Jun-24   20
1011304        22-Jun-24   20
1011304        22-Jun-24   20
1011304        24-Jun-24   23
1011304        24-Jun-24   23
1011304        24-Jun-24   23
1011304        25-Jun-24   26
1011304        25-Jun-24   26
1011304        25-Jun-24   26
1011304        26-Jun-24   28
1011304        26-Jun-24   28
1011304        27-Jun-24   32
1011304        27-Jun-24   32
1011304        27-Jun-24   32
1011304        27-Jun-24   32
1011304        28-Jun-24   35
1011304        28-Jun-24   35
1011304        28-Jun-24   35
1011304        28-Jun-24   35
1011304        29-Jun-24   35
1011304        01-Jul-24   34
1011304        01-Jul-24   34
1011304        02-Jul-24   34
1011304        02-Jul-24   34
1011304        02-Jul-24   34
1011304        03-Jul-24   33
1011304        03-Jul-24   33
1011304        03-Jul-24   33
1011304        04-Jul-24   33
1011304        04-Jul-24   33
1011304        04-Jul-24   33
1011304        05-Jul-24   33
1011304        05-Jul-24   33
1011304        06-Jul-24   31
1011304        08-Jul-24   31
1011304        08-Jul-24   31
1011304        08-Jul-24   31
1011304        09-Jul-24   31
1011304        09-Jul-24   31
1011304        09-Jul-24   31
1011304        10-Jul-24   32
1011304        10-Jul-24   32
1011304        10-Jul-24   32
1011304        11-Jul-24   31
1011304        11-Jul-24   31
1011304        11-Jul-24   31
1011304        12-Jul-24   30
1011304        12-Jul-24   30
1011304        12-Jul-24   30
1011304        13-Jul-24   31
1011304        13-Jul-24   31
1011304        15-Jul-24   31
1011304        15-Jul-24   31
1011304        16-Jul-24   32
1011304        16-Jul-24   32
1011304        16-Jul-24   32
1011304        16-Jul-24   32
1011304        17-Jul-24   32
1011304        17-Jul-24   32
1011304        17-Jul-24   32
1011304        18-Jul-24   32
1011304        18-Jul-24   32
1011304        18-Jul-24   32
1011304        19-Jul-24   34
1011304        19-Jul-24   34
1011304        19-Jul-24   34
1011304        19-Jul-24   34
1017627        27-Jun-24   1
1017627        02-Jul-24   2
1017627        03-Jul-24   3
1017627        04-Jul-24   4
1017627        08-Jul-24   5
1017627        09-Jul-24   7
1017627        09-Jul-24   7
1017627        10-Jul-24   8
1017627        15-Jul-24   8
1017627        16-Jul-24   8
1017627        18-Jul-24   8
1017627        18-Jul-24   8
1017627        22-Jul-24   8
1017627        23-Jul-24   8
1017627        23-Jul-24   8
1017627        24-Jul-24   9
1017627        24-Jul-24   9
1017627        25-Jul-24   10
1017627        26-Jul-24   11
1017627        30-Jul-24   11
1017627        30-Jul-24   11
1017627        31-Jul-24   14
1017627        31-Jul-24   14
1017627        31-Jul-24   14
1017627        01-Aug-24   13
1017627        05-Aug-24   13
1017627        06-Aug-24   13
1017627        06-Aug-24   13
1017627        07-Aug-24   12
770516         01-Jul-24   15
770516         01-Jul-24   15
770516         02-Jul-24   16
770516         03-Jul-24   15
770516         04-Jul-24   15
770516         04-Jul-24   15
770516         05-Jul-24   15
770516         09-Jul-24   13
770516         09-Jul-24   13
770516         10-Jul-24   13
770516         10-Jul-24   13
770516         11-Jul-24   13
770516         12-Jul-24   13
770516         15-Jul-24   12
770516         16-Jul-24   12
770516         17-Jul-24   12
770516         18-Jul-24   11
770516         29-Jul-24   4
770516         30-Jul-24   4
770516         31-Jul-24   5
770516         31-Jul-24   5
770516         01-Aug-24   6
770516         01-Aug-24   6
770516         02-Aug-24   7
770516         05-Aug-24   9
770516         05-Aug-24   9
770516         06-Aug-24   10
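
Since the duplicate same-day sign-ins are what inflate the count, one option would be to de-duplicate in Power Query before any consecutive-day logic runs. A minimal sketch (the Source step is a placeholder for your own table with operative_id and Date; step names are illustrative):

let
    // Placeholder: point this at your own came_on_site table.
    Source = YourCameOnSiteTable,
    #"Changed Type" = Table.TransformColumnTypes(Source, {{"Date", type date}}),
    // Keep one row per operative per day, so two sign-ins on the same day
    // become a single attendance day.
    #"Removed Duplicate Days" = Table.Distinct(#"Changed Type", {"operative_id", "Date"}),
    // Sort so any later index / List.Accumulate step sees dates in order per operative.
    #"Sorted Rows" = Table.Sort(
        #"Removed Duplicate Days",
        {{"operative_id", Order.Ascending}, {"Date", Order.Ascending}}
    )
in
    #"Sorted Rows"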

Hi @lbendlin, do you know if there is any way I can do this?

Here is the slow version (iterating through the entire list for each row)

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("tda7bsMwDAXQf/EcASL1oD12DZAvCDJkylIURYH8fxKgimOZFwYbdj0TH1e0j8ehitCwG7iEw/knMA2n3RMjhY/vFcorUiRKMd+dcthfvwLn3gtwcfLRyaf/dY5OTkZnJwf7NTvIg9mr0UF+3Bzkys1BruJj758GZydPTp6dvBi9Ah+dfPJxik4O8mB2kAezgzxAB3uEDvbr5uLkID9mR/l516WyaHfv11f3oXm/x+b9+23e99Uc1IN8lf/mfU6a9/ttDupBzmAODOYAHcwHOuiLQV8JzAd6fwf+6I+fyetF8QK8Gl1eXSQWqqvP4CazzknnbOKi82ThZbo3We+S9C5JL3D5QGYWnUeVWW8n6XUnvW7Ay2RtMutcTDyH8HQD", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [operative_id = _t, Date = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Date", type date}}),
    #"Added Index" = Table.AddIndexColumn(#"Changed Type", "Index", 0, 1, Int64.Type),
    #"Added Custom" = Table.AddColumn(#"Added Index", "Consec", each List.Accumulate({0..[Index]},1,(state,current)=> if current = 0 or #"Added Index"[operative_id]{current}<>#"Added Index"[operative_id]{current-1} then 1 else if #"Added Index"[Date]{current}=#"Added Index"[Date]{current-1} then state else if #"Added Index"[Date]{current}=#"Added Index"[Date]{current-1}+#duration(1,0,0,0) then state+1 else 1),Int64.Type)
in
    #"Added Custom"

 

A slightly faster version would be to group by operative_id and then run the process for each group separately. Do you want me to show that too?

Yes please if you could show me as well

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("tda7bsMwDAXQf/EcASL1oD12DZAvCDJkylIURYH8fxKgimOZFwYbdj0TH1e0j8ehitCwG7iEw/knMA2n3RMjhY/vFcorUiRKMd+dcthfvwLn3gtwcfLRyaf/dY5OTkZnJwf7NTvIg9mr0UF+3Bzkys1BruJj758GZydPTp6dvBi9Ah+dfPJxik4O8mB2kAezgzxAB3uEDvbr5uLkID9mR/l516WyaHfv11f3oXm/x+b9+23e99Uc1IN8lf/mfU6a9/ttDupBzmAODOYAHcwHOuiLQV8JzAd6fwf+6I+fyetF8QK8Gl1eXSQWqqvP4CazzknnbOKi82ThZbo3We+S9C5JL3D5QGYWnUeVWW8n6XUnvW7Ay2RtMutcTDyH8HQD", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [operative_id = _t, Date = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Date", type date}}),
    #"Grouped Rows" = Table.Group(#"Changed Type", {"operative_id"}, {{"Rows", each Table.AddIndexColumn(Table.SelectColumns(_,{"Date"}), "Index", 0, 1, Int64.Type), type table [Index=Int64.Type, Date=nullable date]}}),
    #"Added Custom1" = Table.AddColumn(#"Grouped Rows", "Custom", (k)=> Table.AddColumn(k[Rows], "Consec", each List.Accumulate({0..[Index]},1,(state,current)=> if current = 0 or k[Rows][Date]{current}=k[Rows][Date]{current-1} then state else if k[Rows][Date]{current}=k[Rows][Date]{current-1}+#duration(1,0,0,0) then state+1 else 1),Int64.Type)),
    #"Removed Columns" = Table.RemoveColumns(#"Added Custom1",{"Rows"}),
    #"Expanded Custom" = Table.ExpandTableColumn(#"Removed Columns", "Custom", {"Date", "Consec"}, {"Date", "Consec"}),
    #"Changed Type1" = Table.TransformColumnTypes(#"Expanded Custom",{{"Date", type date}, {"Consec", Int64.Type}})
in
    #"Changed Type1"

Hi @lbendlin, is there a way to make the index reset back to 1 when it reaches 14, in addition to all the other conditions?

Yes, you can add that as a condition:

 

 

 

if current = 0 or k[Rows][Date]{current}=k[Rows][Date]{current-1} then state else if k[Rows][Date]{current}=k[Rows][Date]{current-1}+#duration(1,0,0,0) and state<14 then state+1 else 1

 

 

Hi @lbendlin, thank you. I think that will work, but the problem now is that I can't seem to load the data into my model. The refresh just runs on and on; I've left it for an hour and still nothing. It loads all the rows quickly and I can see they are all there, but it won't place them into the model. I just get this screen.

 

micpete_0-1724403256782.png

 

Hi @lbendlin, I think it might be something to do with the grouping? Whenever I remove those steps I can load the data. I've tried clearing the cache and amending some of the settings, but it still doesn't work. 😞

Read about Table.Buffer - that will help if you have enough RAM.
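
For example, here is a sketch of how buffering could be applied to the grouped query above: each group's rows and its date list are buffered once (Table.Buffer / List.Buffer), so List.Accumulate indexes into memory instead of re-evaluating the column on every lookup. It also keeps the state < 14 reset from the earlier reply. Untested, and whether it helps depends on data volume and available RAM.

    // Sketch: replacement for the #"Added Custom1" step in the grouped query above.
    #"Added Custom1" = Table.AddColumn(
        #"Grouped Rows",
        "Custom",
        (k) =>
            let
                // Buffer the group's rows and its date list once per group.
                BufferedRows = Table.Buffer(k[Rows]),
                Dates = List.Buffer(BufferedRows[Date])
            in
                Table.AddColumn(
                    BufferedRows,
                    "Consec",
                    each List.Accumulate(
                        {0 .. [Index]},
                        1,
                        (state, current) =>
                            if current = 0 or Dates{current} = Dates{current - 1} then state
                            else if Dates{current} = Dates{current - 1} + #duration(1, 0, 0, 0) and state < 14 then state + 1
                            else 1
                    ),
                    Int64.Type
                )
    ),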
