
Anonymous
Not applicable

Measure for a Conditional Running Total Between Two Date Fields

Hello!

I have the 'BreakTable' below with two separate date fields, a status field, and a distinct ID for each record. I also have a 'DateTable' that contains every date present in the BreakTable.

 

I'm trying to build a measure that counts my rows (or the distinct break IDs) during the period a break is considered open, i.e. between the created and resolved dates. 'RESOLVED' is the only status that has a resolved date; every other status is considered open and has a null value in the BREAK_RESOLUTION_DT column. I would expect those open breaks to continue to be included in the total going forward.

 

Any idea how I can accomplish this using a measure? 

 

'BreakTable'

POSITION_BREAK_ID  BREAK_STATUS_CD  CREATE_DT_TM  BREAK_RESOLUTION_DT
1                  RESOLVED         6/11/2019     6/16/2019
2                  RESOLVED         6/11/2019     6/17/2019
3                  RESOLVED         6/13/2019     6/20/2019
4                  RESOLVED         6/19/2019     6/21/2019
5                  OPEN             6/20/2019
6                  OPEN             6/24/2019
7                  RESEARCH         6/27/2019
8                  RESEARCH         6/28/2019

 

 

 

Example chart: a column chart with Date on the x-axis (6/11/2019 through today) and, for each date, the break IDs open on that date stacked up, i.e. the running count of open breaks per day.
1 ACCEPTED SOLUTION
Mariusz
Community Champion

Hi @Anonymous 

 

Calculating this with a measure would be very slow on large data sets, so I suggest you consider a different approach.

You can use the Query Editor to create a row for every day a break is open, with the M code below.

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMlTSUQpyDfb3CXN1ATINzfQNzPSNDAwtQRxDOCdWJ1rJCE2pOW6lxqhKjQyQlRqjKDVBU2qIrNQSRakpUMg/wNUPSKEaCpI0Q5U0QZE0h1ji6hjk7AFVYI6iwAJTgQVCQSwA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [POSITION_BREAK_ID = _t, BREAK_STATUS_CD = _t, BREAK_RESOLUTION_DT = _t, CREATE_DT_TM = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"POSITION_BREAK_ID", Int64.Type}, {"BREAK_STATUS_CD", type text}, {"BREAK_RESOLUTION_DT", type date}, {"CREATE_DT_TM", type date}}),
    // "+ 1" makes the range inclusive of both the create date and the resolution date (or today, for unresolved breaks)
    #"Added Custom" = Table.AddColumn(#"Changed Type", "Date", each List.Dates( [CREATE_DT_TM], Duration.Days( ( if [BREAK_RESOLUTION_DT] = null then DateTime.Date( DateTime.LocalNow() ) else [BREAK_RESOLUTION_DT] ) - [CREATE_DT_TM] ) + 1, #duration( 1, 0, 0, 0 ) )),
    #"Expanded Date" = Table.ExpandListColumn(#"Added Custom", "Date"),
    #"Changed Type1" = Table.TransformColumnTypes(#"Expanded Date",{{"Date", type date}}),
    #"Removed Other Columns" = Table.SelectColumns(#"Changed Type1",{"POSITION_BREAK_ID", "Date", "BREAK_STATUS_CD"})
in
    #"Removed Other Columns"

The next step is to join the Date column of your new table to the date column of your date dimension in the Modeling tab, and then create a measure as below.

count = DISTINCTCOUNT( BreakTable[POSITION_BREAK_ID] )

Hope this makes sense.

 

Best Regards,
Mariusz

If this post helps, then please consider Accepting it as the solution.

Please feel free to connect with me.
Mariusz Repczynski


2 REPLIES 2
Anonymous
Not applicable

Thanks for the idea; it looks like this could be a solution.

 

One comment I do have: with this approach it looks like I would have to pull my source data in twice if I ever wanted to use other metrics, since I'm now essentially duplicating each record across its date range. I also simplified the data table a bit; there's a decent number of other columns that I'm evaluating.

 

If I went the measure route then I wouldn't need to do that, correct? I'm trying to understand whether it makes more sense to pull the data in twice, or to create a long-running measure.

 

Thanks again for the help!
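For reference, the measure route weighed in the replies above is usually written as an "events in progress" pattern over a date table with no active relationship to the fact table. A minimal sketch, assuming a DateTable[Date] column and the BreakTable columns shown earlier (not the accepted solution's code, just an illustration of the alternative):

Open Breaks =
VAR CurrentDate =
    MAX ( DateTable[Date] )
RETURN
    CALCULATE (
        DISTINCTCOUNT ( BreakTable[POSITION_BREAK_ID] ),
        FILTER (
            ALL ( BreakTable ),
            BreakTable[CREATE_DT_TM] <= CurrentDate
                && ( ISBLANK ( BreakTable[BREAK_RESOLUTION_DT] )
                    || BreakTable[BREAK_RESOLUTION_DT] >= CurrentDate )
        )
    )

Plotted against DateTable[Date], each point counts the breaks whose open interval covers that date; breaks with a blank BREAK_RESOLUTION_DT keep counting through today. The FILTER iterates the whole BreakTable for every date on the axis, which is why this can be slow on large data sets, as noted in the accepted solution.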
