Please see the sample data below:
| serviceaddress_id | readingdate | readingtype | consecutiveEstimate |
|---|---|---|---|
| 80418 | 1/14/2024 12:00:00 PM | Estimate | 1 |
| 80418 | 12/14/2023 12:00:00 PM | Estimate | 1 |
| 80418 | 11/14/2023 12:00:00 PM | Estimate | 1 |
| 80418 | 10/14/2023 12:00:00 PM | Estimate | 1 |
| 80418 | 9/14/2023 12:00:00 PM | Estimate | 1 |
| 80429 | 1/10/2024 3:52:00 PM | Estimate | 1 |
| 80429 | 12/13/2023 6:39:00 PM | Estimate | 1 |
| 80429 | 11/14/2023 10:45:00 AM | Estimate | 1 |
| 80429 | 10/6/2023 12:02:00 PM | Actual | 0 |
| 80429 | 9/1/2023 9:51:00 AM | Actual | 0 |
| 80429 | 8/4/2023 8:44:34 AM | Actual | 0 |
| 80429 | 7/10/2023 11:47:53 AM | Actual | 0 |
| 80429 | 6/13/2023 9:43:32 AM | Actual | 0 |
| 80429 | 5/10/2023 11:36:00 AM | Actual | 0 |
| 80429 | 4/14/2023 12:00:00 PM | Estimate | 0 |
| 80429 | 3/3/2023 12:00:00 PM | Estimate | 0 |
| 80429 | 2/14/2023 12:00:00 PM | Estimate | 0 |
| 80429 | 12/31/2022 12:00:00 PM | Estimate | 0 |
| 80429 | 11/17/2022 12:00:00 AM | Actual | 0 |
The columns that I currently have are serviceaddress_id, readingdate and readingtype. Notice that the first instance for each serviceaddress_id has a readingtype of Estimate.
The requirement is that the consecutiveEstimate value should be 1 for each grouping until the first row where the readingtype is Actual. All subsequent rows, including those where the type reverts back to Estimate, should be 0. Basically, I only want to count consecutive estimates that are currently consecutive. I hope that makes sense. I populated the sample data with my expected results: notice that serviceaddress_id 80429 has a value of 1 in only the first 3 rows, even though there are 4 consecutive estimates further down in the data. Those 4 consecutive estimates for that serviceaddress_id are not current.
I think that Power Query is where this should be done, but if it would be better in DAX I'm open to that.
Any help that can be provided would be greatly appreciated. I have been banging my head on my desk for a couple of days and it's starting to hurt.
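Put another way: a row should be flagged 1 exactly when it is an Estimate and there is no Actual reading with a later readingdate for the same serviceaddress_id. A minimal M sketch of that restatement, using hypothetical inline data in place of the real table (Table.MatchesAnyRows scans the whole table per row, so this is illustrative rather than tuned for performance):

let
    // hypothetical stand-in for the real table
    Readings = Table.FromRecords({
        [serviceaddress_id = 80429, readingdate = #datetime(2024, 1, 10, 15, 52, 0), readingtype = "Estimate"],
        [serviceaddress_id = 80429, readingdate = #datetime(2023, 10, 6, 12, 2, 0), readingtype = "Actual"],
        [serviceaddress_id = 80429, readingdate = #datetime(2023, 4, 14, 12, 0, 0), readingtype = "Estimate"]
    }),
    // 1 exactly when the row is an Estimate with no later Actual for the same id
    Flagged = Table.AddColumn(Readings, "consecutiveEstimate", (r) =>
        if r[readingtype] = "Estimate"
            and not Table.MatchesAnyRows(Readings, (q) =>
                q[serviceaddress_id] = r[serviceaddress_id]
                and q[readingtype] = "Actual"
                and q[readingdate] > r[readingdate])
        then 1 else 0,
        Int64.Type)
in
    Flagged  // yields 1, 0, 0 for the three rows above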
Hi @El_Jefe
This is my solution with Power Query. You can use the attached sample file to see how it works, or replace the Source step with your own.
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("ldI9C8IwEAbgv1IyC7ncXdrktg6OgnvpUMRB0Mn6/42m9kOMTSFTeLjce5emUQ7YOLVTRhvWCMiFQQEIpzgewv3+3l9uXX9W7W6GcdCUpc0mDVu0z8ToY0aIGUksrtgQkWLlUsiv4FlCELYvXac16HLqeeqjPvWP7rqgIV6UXqwZq/6ATg/vO2EW4iSshhmEx41wJZaStBwn4IVJCJPSzotS+a9R3rIw0pRtc7/kZ7v0nixm6bDe6ksv8rVP", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [serviceaddress_id = _t, readingdate = _t, readingtype = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"serviceaddress_id", Int64.Type}, {"readingdate", type datetime}, {"readingtype", type text}}),
#"Grouped Rows" = Table.Group(#"Changed Type", {"serviceaddress_id"}, {{"AllData", each _, type table [serviceaddress_id=nullable number, readingdate=nullable datetime, readingtype=nullable text]}}),
Custom1 = Table.TransformColumns(#"Grouped Rows", {"AllData", each Table.FillDown(Table.AddColumn(Table.AddIndexColumn(_, "Index", 1, 1), "consecutiveEstimate", each if [readingtype] = "Estimate" and [Index] = 1 then 1 else if [readingtype] = "Actual" then 0 else null), {"consecutiveEstimate"})}),
#"Removed Other Columns" = Table.SelectColumns(Custom1,{"AllData"}),
#"Expanded AllData" = Table.ExpandTableColumn(#"Removed Other Columns", "AllData", {"serviceaddress_id", "readingdate", "readingtype", "Index", "consecutiveEstimate"}, {"serviceaddress_id", "readingdate", "readingtype", "Index", "consecutiveEstimate"}),
#"Removed Columns" = Table.RemoveColumns(#"Expanded AllData",{"Index"})
in
#"Removed Columns"
Best Regards,
Jing
If this post helps, please Accept it as Solution to help other members find it. Appreciate your Kudos!
Hi @El_Jefe
A Group By operation is helpful for these types of requirements; alternatively, you can try this.
Just create a new blank query, open the Advanced Editor, and replace everything inside it with this code.
let
    // buffer one (serviceaddress_id, readingtype) pair per local group, to search in below
    lookIn = List.Buffer(List.Zip({GroupRows[serviceaddress_id], GroupRows[readingtype]})),
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("lZI9C8MgEIb/SnAOeHqn0dsydCx0DxlC6VBop6b/v6aGfJUaAw4iz6P3ntc0wgEpJ0qhpCKpQVOhNAOEVVzO4fz06u/Prr8NiGjLhaBHA7MNddiAo4Y/IGgfc0PMjWx0Bh9iY3zBMvoMYZEamMxg1GkDpJ0zzDXV1/7dPcIGVniIHGnPRk23/4GdHGtxTMRISbgaexMKUUwVG0ziduqMZ0JGnaTN8nK0e4XT/seuBZR4iM8YZ9hOAn47r7ONMArVxvjJ3H4A", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [serviceaddress_id = _t, readingdate = _t, readingtype = _t, consecutiveEstimate = _t]),
    // GroupKind.Local starts a new group whenever the key pair changes, so each
    // unbroken run of Estimates or Actuals per id becomes its own group
    GroupRows = Table.Group(Source, {"serviceaddress_id", "readingtype"}, {{"t", each _, type table [serviceaddress_id=nullable text, readingdate=nullable text, readingtype=nullable text, consecutiveEstimate=nullable text]}}, GroupKind.Local),
addFirstOccurance = Table.RemoveColumns( Table.AddColumn(
Table.AddIndexColumn(GroupRows, "Index", 0, 1), "consecutiveEst", each
[
            // position of the first Estimate run for this id among all local groups
            a = List.PositionOf( lookIn, {[serviceaddress_id], "Estimate"}),
            // 1 only when that first Estimate run is this very group
            b = if a = [Index] then 1 else 0
][b], Int64.Type ),
{"Index"}
),
ExpandTable = Table.ExpandTableColumn(addFirstOccurance, "t", {"consecutiveEstimate"})
in
ExpandTable
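The key here is GroupKind.Local: unlike the default global grouping, it preserves row order and starts a fresh group every time the key changes. A minimal illustration with hypothetical data:

let
    Runs = Table.FromRecords({[k = "A"], [k = "A"], [k = "B"], [k = "A"]}),
    // local grouping keeps the A-run, B-run, and second A-run separate
    Local = Table.Group(Runs, {"k"}, {{"n", Table.RowCount, Int64.Type}}, GroupKind.Local)
    // three groups: A (n = 2), B (n = 1), A (n = 1)
in
    Local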
I hope this is helpful
Hi, @El_Jefe,
v1
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("ldI9C8IwEAbgv1IyC7ncXdrktg6OgnvpUMRB0Mn6/42m9kOMTSFTeLjce5emUQ7YOLVTRhvWCMiFQQEIpzgewv3+3l9uXX9W7W6GcdCUpc0mDVu0z8ToY0aIGUksrtgQkWLlUsiv4FlCELYvXac16HLqeeqjPvWP7rqgIV6UXqwZq/6ATg/vO2EW4iSshhmEx41wJZaStBwn4IVJCJPSzotS+a9R3rIw0pRtc7/kZ7v0nixm6bDe6ksv8rVP", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [serviceaddress_id = _t, readingdate = _t, readingtype = _t]),
GroupedRows = Table.Group(Source, {"serviceaddress_id"}, {
{"All", each _, type table},
{"Status", each Table.AddColumn(_, "consecutiveEstimate", (x)=> if x[readingtype] = "Actual" or List.Contains(Table.FirstN(_, (y)=> y[readingdate] <> x[readingdate])[readingtype], "Actual") then 0 else 1, Int64.Type), type table}
}),
CombinedStatus = Table.Combine(GroupedRows[Status])
in
CombinedStatus
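v1 leans on the conditional form of Table.FirstN: when given a function instead of a row count, it returns rows from the top for as long as the condition holds, which is how it collects every row above the current one within the group. A small sketch with hypothetical values:

let
    T = Table.FromRecords({[v = 1], [v = 2], [v = 5], [v = 3]}),
    // takes rows from the top while the condition holds, stopping at the first failure
    Lead = Table.FirstN(T, each [v] < 5)
    // returns only the rows with v = 1 and v = 2
in
    Lead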
v2
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("ldKxDsIgEAbgV2mYmwB3B4XbOjiauDcdGuNgopP1/cVStVVpacJEvhz339E0winSTpRCS00SFFChgZUKpzjsw/3u1p+vXX8SbTnBMGrM0nqTVgldH/t7d5lZn1kYfEyoYkJkAys2BMRY2TL6FTzJp5jMU9e/HUerpP10DIlwAw3hovRs9FJNJ8fXHRMxUhJW4wTC45qpYoOR/k1l3xPwTMgIybJmWhbtUqu0ZWEoMdvmfsjXdnGYLWTpsN7qS8/ytQ8=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [serviceaddress_id = _t, readingdate = _t, readingtype = _t]),
ConsecutiveEstimate =
[
t = Table.Buffer(Source),
        // one {id, "Actual"} sentinel pair per distinct id
        zip = List.Zip({List.Distinct(t[serviceaddress_id]), List.Repeat({"Actual"}, List.Count(List.Distinct(t[serviceaddress_id])))}),
lg =
List.Generate(
()=> [
x = 0,
row = Record.ToList(t{x}[[serviceaddress_id], [readingtype]]),
                // start tracking this id if its very first (most recent) reading is already an Actual
                check = List.Select(if List.ContainsAny(zip, {row}) then {row} else {{}}, each _ <> {}),
                // once an Actual has been seen for an id, every later row of that id gets 0
                result = if List.Contains(List.Combine(check), row{0}) then 0 else 1
],
each [x] < Table.RowCount(t),
each [
x = [x]+1,
row = Record.ToList(t{x}[[serviceaddress_id], [readingtype]]),
                // add newly-seen {id, "Actual"} pairs to the running set
                check = List.Select(if List.ContainsAny(zip, {row}) and not List.ContainsAny([check], {row}) then [check] & {row} else [check], each _ <> {}),
result = if List.Contains(List.Combine(check), row{0}) then 0 else 1
],
each [result]
)
][lg],
Merged = Table.FromColumns(Table.ToColumns(Source) & {ConsecutiveEstimate}, Table.ColumnNames(Source) & {"Consecutive Estimate"})
in
Merged
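v2 is driven by List.Generate, which builds a list from a seed state, a continuation test, a next-state function, and a selector; here the state carries which ids have already produced an Actual. The same running-flag idea in miniature, for a single hypothetical id:

let
    Types = {"Estimate", "Estimate", "Actual", "Estimate"},
    Flags = List.Generate(
        // seed: the flag starts on only when the newest reading is an Estimate
        () => [i = 0, on = Types{0} = "Estimate"],
        (s) => s[i] < List.Count(Types),
        // the flag switches off permanently at the first non-Estimate
        (s) => [i = s[i] + 1, on = s[on] and Types{i} = "Estimate"],
        (s) => Number.From(s[on])
    )
in
    Flags  // {1, 1, 0, 0}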
let
Source = your_table,
f = (tbl as table) =>
[recs = Table.ToRecords(tbl),
     gen = List.Generate(
        // seed: the flag is 1 only if the newest row is an Estimate
        () => [i = 0, c = recs{0}, sta = Byte.From(c[readingtype] = "Estimate"), r = c & [consecutiveEstimate = sta]],
(x) => x[i] < List.Count(recs),
(x) =>
[i = x[i] + 1,
c = recs{i},
          // the flag stays 1 only while the run of leading Estimates is unbroken
          sta = Byte.From(c[readingtype] = "Estimate" and x[sta] = 1),
r = c & [consecutiveEstimate = sta]],
(x) => x[r]
),
out = Table.FromRecords(gen)][out],
     // per id: sort newest first, then scan the rows with f
     gr = Table.Group(Source, "serviceaddress_id", {{"all", each f(Table.Sort(_, {"readingdate", Order.Descending}))}}),
expand = Table.ExpandTableColumn(gr, "all", {"readingdate", "readingtype", "consecutiveEstimate"})
in
expand
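To use this, point `your_table` at wherever the data actually lives. A hypothetical example, assuming an Excel table named Readings; typing readingdate as datetime matters because the per-group sort depends on it:

let
    // hypothetical source step in place of your_table
    your_table = Excel.CurrentWorkbook(){[Name = "Readings"]}[Content],
    Typed = Table.TransformColumnTypes(your_table,
        {{"serviceaddress_id", Int64.Type}, {"readingdate", type datetime}, {"readingtype", type text}})
in
    Typed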