Hello!
I have the 'BreakTable' below with two separate date fields, a status field, and a distinct ID for each record. I also have a 'DateTable' that contains every date present in my BreakTable.
I'm trying to build a measure that counts my rows (or the distinct break IDs) during the period each break is considered open, i.e. between the created date and the resolved date. The only status that has a resolved date is 'RESOLVED'; every other status is considered open and has a null value in the BREAK_RESOLUTION_DT column. I would expect those open breaks to continue to be included in the total going forward.
Any idea how I can accomplish this using a measure?
'BreakTable'
POSITION_BREAK_ID | BREAK_STATUS_CD | CREATE_DT_TM | BREAK_RESOLUTION_DT
---|---|---|---
1 | RESOLVED | 6/11/2019 | 6/16/2019
2 | RESOLVED | 6/11/2019 | 6/17/2019
3 | RESOLVED | 6/13/2019 | 6/20/2019
4 | RESOLVED | 6/19/2019 | 6/21/2019
5 | OPEN | 6/20/2019 |
6 | OPEN | 6/24/2019 |
7 | RESEARCH | 6/27/2019 |
8 | RESEARCH | 6/28/2019 |
Example chart (expected result: the break IDs open on each date, and the count I want the measure to return):

Date | Open Break IDs | Open Count
---|---|---
6/11 | 1, 2 | 2
6/12 | 1, 2 | 2
6/13 | 1, 2, 3 | 3
6/14 | 1, 2, 3 | 3
6/15 | 1, 2, 3 | 3
6/16 | 1, 2, 3 | 3
6/17 | 2, 3 | 2
6/18 | 3 | 1
6/19 | 3, 4 | 2
6/20 | 3, 4, 5 | 3
6/21 | 4, 5 | 2
6/22 | 5 | 1
6/23 | 5 | 1
6/24 | 5, 6 | 2
6/25 | 5, 6 | 2
6/26 | 5, 6 | 2
6/27 | 5, 6, 7 | 3
6/28 | 5, 6, 7, 8 | 4
…Today | 5, 6, 7, 8 | 4
Hi @Anonymous
Calculating this with a measure would be very slow on large data sets, so I suggest a different approach.
You can use the Query Editor to create a row for every day a break is open, using the M code below.
```
let
    // Sample data embedded as compressed text; in your report this step would be your real source table
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMlTSUQpyDfb3CXN1ATINzfQNzPSNDAwtQRxDOCdWJ1rJCE2pOW6lxqhKjQyQlRqjKDVBU2qIrNQSRakpUMg/wNUPSKEaCpI0Q5U0QZE0h1ji6hjk7AFVYI6iwAJTgQVCQSwA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [POSITION_BREAK_ID = _t, BREAK_STATUS_CD = _t, BREAK_RESOLUTION_DT = _t, CREATE_DT_TM = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source, {{"POSITION_BREAK_ID", Int64.Type}, {"BREAK_STATUS_CD", type text}, {"BREAK_RESOLUTION_DT", type date}, {"CREATE_DT_TM", type date}}),
    // Build the list of dates each break was open: from creation through resolution,
    // or through today when BREAK_RESOLUTION_DT is null; the + 1 keeps the resolution day itself in the list
    #"Added Custom" = Table.AddColumn(#"Changed Type", "Date", each List.Dates(
        [CREATE_DT_TM],
        Duration.Days((if [BREAK_RESOLUTION_DT] = null then DateTime.Date(DateTime.LocalNow()) else [BREAK_RESOLUTION_DT]) - [CREATE_DT_TM]) + 1,
        #duration(1, 0, 0, 0))),
    // Expand the list so there is one row per break per day it was open
    #"Expanded Date" = Table.ExpandListColumn(#"Added Custom", "Date"),
    #"Changed Type1" = Table.TransformColumnTypes(#"Expanded Date", {{"Date", type date}}),
    #"Removed Other Columns" = Table.SelectColumns(#"Changed Type1", {"POSITION_BREAK_ID", "Date", "BREAK_STATUS_CD"})
in
    #"Removed Other Columns"
```
The next step is to join the Date column of this new table to the date column in your date dimension (in the Modeling tab), and then create a measure like the one below.
count = DISTINCTCOUNT( BreakTable[POSITION_BREAK_ID] )
Hope this makes sense.
Thanks for the idea; it looks like this could be a solution.
One comment I do have: with this approach it looks like I would have to pull my source data in twice if I ever wanted to use other metrics, since I'm now essentially duplicating each record across the date range. I also simplified the data table a bit; there's a decent number of other columns that I'm evaluating.
If I went the measure route then I wouldn't need to do that, correct? I'm trying to understand whether it makes more sense to pull the data in twice or to create a long-running measure.
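For reference, the measure route I have in mind is something along the lines of the sketch below. It's only a sketch: it assumes DateTable has no active relationship to BreakTable and that its date column is called DateTable[Date], so the names would need adjusting to the real model.

```
Open Breaks =
VAR CurrentDate = MAX ( DateTable[Date] )
RETURN
    CALCULATE (
        DISTINCTCOUNT ( BreakTable[POSITION_BREAK_ID] ),
        FILTER (
            BreakTable,
            -- created on or before the date being evaluated...
            BreakTable[CREATE_DT_TM] <= CurrentDate
                -- ...and not yet resolved as of that date (blank resolution = still open)
                && ( ISBLANK ( BreakTable[BREAK_RESOLUTION_DT] )
                    || BreakTable[BREAK_RESOLUTION_DT] >= CurrentDate )
        )
    )
```

I can see how iterating BreakTable like this for every date cell could get slow on a large table, which I take to be your point about the measure approach.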
Thanks again for the help!