El_Jefe
New Member

Group Counter that Resets to Zero when conditions are met

Please see the sample data below:

 

serviceaddress_id | readingdate            | readingtype | consecutiveEstimate
80418             | 1/14/2024 12:00:00 PM  | Estimate    | 1
80418             | 12/14/2023 12:00:00 PM | Estimate    | 1
80418             | 11/14/2023 12:00:00 PM | Estimate    | 1
80418             | 10/14/2023 12:00:00 PM | Estimate    | 1
80418             | 9/14/2023 12:00:00 PM  | Estimate    | 1
80429             | 1/10/2024 3:52:00 PM   | Estimate    | 1
80429             | 12/13/2023 6:39:00 PM  | Estimate    | 1
80429             | 11/14/2023 10:45:00 AM | Estimate    | 1
80429             | 10/6/2023 12:02:00 PM  | Actual      | 0
80429             | 9/1/2023 9:51:00 AM    | Actual      | 0
80429             | 8/4/2023 8:44:34 AM    | Actual      | 0
80429             | 7/10/2023 11:47:53 AM  | Actual      | 0
80429             | 6/13/2023 9:43:32 AM   | Actual      | 0
80429             | 5/10/2023 11:36:00 AM  | Actual      | 0
80429             | 4/14/2023 12:00:00 PM  | Estimate    | 0
80429             | 3/3/2023 12:00:00 PM   | Estimate    | 0
80429             | 2/14/2023 12:00:00 PM  | Estimate    | 0
80429             | 12/31/2022 12:00:00 PM | Estimate    | 0
80429             | 11/17/2022 12:00:00 AM | Actual      | 0

 

The columns that I currently have are serviceaddress_id, readingdate, and readingtype. Notice that the first (most recent) row for each serviceaddress_id has a readingtype of Estimate.

 

The requirement is that consecutiveEstimate should be 1 for each row in a grouping until the first row where readingtype is Actual. All subsequent rows, including any where readingtype reverts back to Estimate, should be 0. Basically, I only want to flag estimates that are currently consecutive, meaning the unbroken run of Estimates at the most recent end of each grouping. I populated the sample data with my expected results: serviceaddress_id 80429 has a value of 1 in just its first 3 rows, even though there are 4 consecutive Estimates further down in the data. Those 4 consecutive Estimates for that serviceaddress_id are not current.

 

I think that Power Query is where this should be done, but if it would be better in DAX I'm open to that.
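
For a single serviceaddress_id in isolation I can at least state the rule as code. Here is a throwaway M sketch over just the readingtype values of one id, newest reading first (hypothetical values, not my real query):

let
    // Hypothetical readingtype values for one serviceaddress_id, newest reading first
    types = {"Estimate", "Estimate", "Actual", "Estimate"},
    // Position of the first "Actual"; -1 when the whole list is Estimates
    firstActual = List.PositionOf(types, "Actual"),
    // 1 for every row before the first "Actual", 0 from there on
    flags = List.Transform({0..List.Count(types) - 1}, (i) => if firstActual <> -1 and i >= firstActual then 0 else 1)
in
    flags // {1, 1, 0, 0}

What I can't figure out is how to apply that logic per serviceaddress_id, newest reading first, inside my actual query.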

 

Any help that can be provided would be greatly appreciated.  I have been banging my head on my desk for a couple of days and it's starting to hurt. 


5 REPLIES
AlienSx
Super User

@El_Jefe 

let
    Source = your_table,
    // f flags one id's rows (sorted newest first): 1 while the run of "Estimate"
    // readings is unbroken, 0 from the first "Actual" onward
    f = (tbl as table) => 
        [recs = Table.ToRecords(tbl),
        gen = List.Generate(
            // seed: the newest record; flag is 1 only if it is an Estimate
            () => [i = 0, c = recs{0}, sta = Byte.From(c[readingtype] = "Estimate"), r = c & [consecutiveEstimate = sta]],
            (x) => x[i] < List.Count(recs),
            // step: the flag stays 1 only while this row is an Estimate and the previous flag was 1
            (x) => 
                [i = x[i] + 1, 
                c = recs{i}, 
                sta = Byte.From(c[readingtype] = "Estimate" and x[sta] = 1), 
                r = c & [consecutiveEstimate = sta]],
            (x) => x[r]
        ),
        out = Table.FromRecords(gen)][out],
    gr = Table.Group(Source, "serviceaddress_id", {{"all", each f(Table.Sort(_, {"readingdate", Order.Descending}))}}),
    expand = Table.ExpandTableColumn(gr, "all", {"readingdate", "readingtype", "consecutiveEstimate"})
in
    expand

Hi @AlienSx, the result is not correct. @El_Jefe asked for a different one.

EDIT: your formula works after a Changed Type step on readingdate.


Note: Check this link to learn how to use my query.
Check this link if you don't know how to provide sample data.

dufoq3
Super User

Hi, @El_Jefe,

 

Result (screenshot: dufoq3_0-1707557687376.png):

 

v1

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("ldI9C8IwEAbgv1IyC7ncXdrktg6OgnvpUMRB0Mn6/42m9kOMTSFTeLjce5emUQ7YOLVTRhvWCMiFQQEIpzgewv3+3l9uXX9W7W6GcdCUpc0mDVu0z8ToY0aIGUksrtgQkWLlUsiv4FlCELYvXac16HLqeeqjPvWP7rqgIV6UXqwZq/6ATg/vO2EW4iSshhmEx41wJZaStBwn4IVJCJPSzotS+a9R3rIw0pRtc7/kZ7v0nixm6bDe6ksv8rVP", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [serviceaddress_id = _t, readingdate = _t, readingtype = _t]),
    // For each id, compute the flag row-by-row: 0 when the row itself is an "Actual", or when
    // any row above it (a more recent reading) in the same group is an "Actual"; otherwise 1
    GroupedRows = Table.Group(Source, {"serviceaddress_id"}, {
        {"All", each _, type table},
        {"Status", each Table.AddColumn(_, "consecutiveEstimate", (x)=> if x[readingtype] = "Actual" or List.Contains(Table.FirstN(_, (y)=> y[readingdate] <> x[readingdate])[readingtype], "Actual") then 0 else 1, Int64.Type), type table}
    }),
    CombinedStatus = Table.Combine(GroupedRows[Status])
in
    CombinedStatus
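
The trick in v1 is the predicate overload of Table.FirstN: instead of a row count, it takes a condition and returns the leading rows while the condition holds, so here it collects every row above the current one. A minimal sketch, assuming a hypothetical table t sorted newest first:

let
    t = Table.FromRecords({
        [readingdate = 3, readingtype = "Estimate"],
        [readingdate = 2, readingtype = "Actual"],
        [readingdate = 1, readingtype = "Estimate"]
    }),
    // Leading rows with readingdate <> 1, i.e. everything above the oldest row
    above = Table.FirstN(t, each [readingdate] <> 1)
in
    List.Contains(above[readingtype], "Actual") // true, so the oldest row gets 0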

 

 

v2

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("ldKxDsIgEAbgV2mYmwB3B4XbOjiauDcdGuNgopP1/cVStVVpacJEvhz339E0winSTpRCS00SFFChgZUKpzjsw/3u1p+vXX8SbTnBMGrM0nqTVgldH/t7d5lZn1kYfEyoYkJkAys2BMRY2TL6FTzJp5jMU9e/HUerpP10DIlwAw3hovRs9FJNJ8fXHRMxUhJW4wTC45qpYoOR/k1l3xPwTMgIybJmWhbtUqu0ZWEoMdvmfsjXdnGYLWTpsN7qS8/ytQ8=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [serviceaddress_id = _t, readingdate = _t, readingtype = _t]),
    ConsecutiveEstimate = 
        [ 
            t = Table.Buffer(Source),
            // One {id, "Actual"} marker pair per distinct serviceaddress_id
            zip = List.Zip({List.Distinct(t[serviceaddress_id]), List.Repeat({"Actual"}, List.Count(List.Distinct(t[serviceaddress_id])))}),
            // Scan rows top (newest) to bottom; `check` collects the marker of every id that
            // has already produced an "Actual" row, and each such id's remaining rows get 0
            lg = 
                List.Generate( 
                    ()=> [ 
                            x = 0, 
                            row = Record.ToList(t{x}[[serviceaddress_id], [readingtype]]),
                            check = List.Select(if List.ContainsAny(zip, {row}) then {row} else {{}}, each _ <> {}),
                            result = if List.Contains(List.Combine(check), row{0}) then 0 else 1
                         ],
                    each [x] < Table.RowCount(t),
                    each [ 
                            x = [x]+1,
                            row = Record.ToList(t{x}[[serviceaddress_id], [readingtype]]),
                            check = List.Select(if List.ContainsAny(zip, {row}) and not List.ContainsAny([check], {row}) then [check] & {row} else [check], each _ <> {}),
                            result = if List.Contains(List.Combine(check), row{0}) then 0 else 1
                         ],
                    each [result]
                )
        ][lg],
    Merged = Table.FromColumns(Table.ToColumns(Source) & {ConsecutiveEstimate}, Table.ColumnNames(Source) & {"Consecutive Estimate"})
in
    Merged
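
v2 is driven by List.Generate, which builds a list from a seed state, a continuation test, a step function, and a selector. A stripped-down sketch of the same running-flag idea, using a hypothetical types list rather than the full query:

let
    types = {"Estimate", "Estimate", "Actual", "Estimate"},
    flags = List.Generate(
        () => [i = 0, ok = types{0} = "Estimate"],                 // seed state
        each [i] < List.Count(types),                              // continue while rows remain
        each [i = [i] + 1, ok = [ok] and types{i} = "Estimate"],   // carry the flag forward
        each Number.From([ok])                                     // emit 1 or 0
    )
in
    flags // {1, 1, 0, 0}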

 


Note: Check this link to learn how to use my query.
Check this link if you don't know how to provide sample data.

m_dekorte
Super User

Hi @El_Jefe 

 

A Group By operation is helpful in these types of requirements; alternatively, you can try this.

Just create a new blank query, open the Advanced Editor, and replace everything inside it with this code.

let
    // {id, readingtype} pairs, one per local group (run); used to find each id's first "Estimate" run
    lookIn = List.Buffer(List.Zip({GroupRows[serviceaddress_id], GroupRows[readingtype]})),
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("lZI9C8MgEIb/SnAOeHqn0dsydCx0DxlC6VBop6b/v6aGfJUaAw4iz6P3ntc0wgEpJ0qhpCKpQVOhNAOEVVzO4fz06u/Prr8NiGjLhaBHA7MNddiAo4Y/IGgfc0PMjWx0Bh9iY3zBMvoMYZEamMxg1GkDpJ0zzDXV1/7dPcIGVniIHGnPRk23/4GdHGtxTMRISbgaexMKUUwVG0ziduqMZ0JGnaTN8nK0e4XT/seuBZR4iM8YZ9hOAn47r7ONMArVxvjJ3H4A", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [serviceaddress_id = _t, readingdate = _t, readingtype = _t, consecutiveEstimate = _t]),
    // GroupKind.Local starts a new group whenever id or readingtype changes between
    // consecutive rows, so every unbroken run of equal values becomes its own sub-table
    GroupRows = Table.Group(Source, {"serviceaddress_id", "readingtype"}, {{"t", each _, type table [serviceaddress_id=nullable text, readingdate=nullable text, readingtype=nullable text, consecutiveEstimate=nullable text]}}, GroupKind.Local),
    // Flag a run with 1 only when it is the first "Estimate" run for its id, else 0
    addFirstOccurance = Table.RemoveColumns( Table.AddColumn(
        Table.AddIndexColumn(GroupRows, "Index", 0, 1), "consecutiveEst", each 
        [
            a = List.PositionOf( lookIn, {[serviceaddress_id], "Estimate"}),
            b = if a = [Index] then 1 else 0
        ][b], Int64.Type ),
        {"Index"}
    ),
    ExpandTable = Table.ExpandTableColumn(addFirstOccurance, "t", {"consecutiveEstimate"})
in
    ExpandTable
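
The pivotal option here is GroupKind.Local: it starts a new group every time the key changes between consecutive rows, so each unbroken run becomes its own group instead of all equal keys being pooled together. A small demo with a hypothetical two-column table:

let
    t = Table.FromRecords({
        [id = 1, kind = "E"], [id = 1, kind = "E"],
        [id = 1, kind = "A"], [id = 1, kind = "E"]
    }),
    // Local grouping yields three runs: {E,E}, {A}, {E}; global grouping would yield two groups
    runs = Table.Group(t, {"id", "kind"}, {{"n", each Table.RowCount(_), Int64.Type}}, GroupKind.Local)
in
    runs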

I hope this is helpful.

v-jingzhan-msft
Community Support

Hi @El_Jefe 

 

This is my solution with Power Query. You can use the attached sample file to see how it works, or replace the Source step with your own data.

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("ldI9C8IwEAbgv1IyC7ncXdrktg6OgnvpUMRB0Mn6/42m9kOMTSFTeLjce5emUQ7YOLVTRhvWCMiFQQEIpzgewv3+3l9uXX9W7W6GcdCUpc0mDVu0z8ToY0aIGUksrtgQkWLlUsiv4FlCELYvXac16HLqeeqjPvWP7rqgIV6UXqwZq/6ATg/vO2EW4iSshhmEx41wJZaStBwn4IVJCJPSzotS+a9R3rIw0pRtc7/kZ7v0nixm6bDe6ksv8rVP", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [serviceaddress_id = _t, readingdate = _t, readingtype = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"serviceaddress_id", Int64.Type}, {"readingdate", type datetime}, {"readingtype", type text}}),
    #"Grouped Rows" = Table.Group(#"Changed Type", {"serviceaddress_id"}, {{"AllData", each _, type table [serviceaddress_id=nullable number, readingdate=nullable datetime, readingtype=nullable text]}}),
    Custom1 = Table.TransformColumns(#"Grouped Rows", {"AllData", each Table.FillDown(Table.AddColumn(Table.AddIndexColumn(_, "Index", 1, 1), "consecutiveEstimate", each if [readingtype] = "Estimate" and [Index] =  1 then 1 else if [readingtype] = "Actual" then 0 else null), {"consecutiveEstimate"})}),
    #"Removed Other Columns" = Table.SelectColumns(Custom1,{"AllData"}),
    #"Expanded AllData" = Table.ExpandTableColumn(#"Removed Other Columns", "AllData", {"serviceaddress_id", "readingdate", "readingtype", "Index", "consecutiveEstimate"}, {"serviceaddress_id", "readingdate", "readingtype", "Index", "consecutiveEstimate"}),
    #"Removed Columns" = Table.RemoveColumns(#"Expanded AllData",{"Index"})
in
    #"Removed Columns"

 

Best Regards,
Jing
If this post helps, please Accept it as Solution to help other members find it. Appreciate your Kudos!
