Hi everyone,
I'm developing a Power BI report to track manufacturing defects for my organization. We use a defect rate threshold that updates periodically based on trending data.
I've attached a sample file that reflects the general structure of our data—it's not an exact match, but it's close.
I already have measures in place to calculate both the threshold and the error rate. The challenge I’m facing is how to manage updates to the threshold over time. Specifically, I’m trying to figure out how to:
I’d really appreciate any guidance on how to accomplish this in Power BI.
Note: The file includes a "Proposed Error Threshold" table. This is not the format currently used in the model, but I would prefer a solution that uses it, if possible.
Thanks in advance for your help!
Hi @scoutmedic ,
Thanks for posting in the Microsoft Fabric Community and for providing the sample data and expected output.
I reproduced the scenario in Power Query using the provided structure. Here are the steps I followed:
Converted the wide format to long format by unpivoting the threshold date columns into rows.
Sorted the data by Threshold Date, Category, and Category Detail to prepare for row-wise comparison.
Grouped the table by Category and Category Detail to handle threshold updates within each group.
Added an End Date column inside each group by identifying the next Threshold Date and subtracting one day. If there was no later date, the End Date falls back to 30-12-9999 (the 31-12-9999 sentinel minus one day).
Expanded the grouped table back to a flat structure with the new End Date column.
Here is the M code used:
let
Source = Table.FromRows(
Json.Document(
Binary.Decompress(
Binary.FromText("i45WCsmvLFbSUXJOLAJRJiCsZ6QUqwOXCSkqTc4GMYxBGC6XnwMSC04uSi1PKcosSy1CURILAA==", BinaryEncoding.Base64),
Compression.Deflate
)
),
let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Category = _t, #"Category Detail" = _t, #"1/1/2025" = _t, #"3/1/2025" = _t]
),
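// Type the columns, then unpivot the two threshold-date columns into Threshold Date / Threshold rows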
#"Changed Type" = Table.TransformColumnTypes(Source, {
{"Category", type text},
{"Category Detail", type text},
{"1/1/2025", Int64.Type},
{"3/1/2025", type number}
}),
#"Unpivoted Only Selected Columns" = Table.Unpivot(#"Changed Type", {"3/1/2025", "1/1/2025"}, "Attribute", "Value"),
#"Renamed Columns" = Table.RenameColumns(#"Unpivoted Only Selected Columns", {{"Attribute", "Threshold Date"}, {"Value", "Threshold"}}),
#"Sorted Rows" = Table.Sort(#"Renamed Columns", {
{"Threshold Date", Order.Ascending},
{"Category", Order.Ascending},
{"Category Detail", Order.Ascending}
}),
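// Group by Category and Category Detail so End Dates are derived only from thresholds within the same group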
#"Grouped Rows" = Table.Group(#"Sorted Rows", {"Category", "Category Detail"}, {
{"All Rows", each _, type table [Category=nullable text, Category Detail=nullable text, Threshold Date=text, Threshold=number]}
}),
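// Within each group, End Date = (next Threshold Date) - 1 day; rows with no later threshold get 31-12-9999 - 1 day = 30-12-9999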
#"Added End Date to Groups" = Table.AddColumn(#"Grouped Rows", "AllRowsWithEndDate", each
Table.AddColumn(
[All Rows],
"End Date",
(row) =>
let
currentDate = Date.FromText(row[Threshold Date]),
futureDates = List.Select(
List.Transform([All Rows][Threshold Date], each Date.FromText(_)),
each _ > currentDate
),
nextDate = if List.IsEmpty(futureDates) then #date(9999,12,31) else List.Min(futureDates)
in
nextDate - #duration(1,0,0,0)
)
),
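// Keep only the nested tables and expand them back into a flat table that includes the new End Date column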
#"Removed Other Columns" = Table.SelectColumns(#"Added End Date to Groups", {"AllRowsWithEndDate"}),
#"Expanded AllRowsWithEndDate" = Table.ExpandTableColumn(#"Removed Other Columns", "AllRowsWithEndDate", {"Category", "Category Detail", "Threshold Date", "Threshold", "End Date"})
in
#"Expanded AllRowsWithEndDate"
This approach avoids editing the original threshold values and derives each row's End Date dynamically from the date the next threshold takes effect.
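On the report side, a measure can then look up the threshold whose Threshold Date / End Date window contains the date being evaluated. The sketch below is only illustrative: it assumes the expanded query is loaded as a table named 'Error Threshold', that Threshold Date and End Date are set to the Date data type in the model (the query output leaves Threshold Date as text), that a 'Date' table supplies the date context, and that any Category / Category Detail filters reach the threshold table. Adjust the names to match your model.

Active Threshold =
VAR SelectedDate =
    MAX ( 'Date'[Date] )    -- assumed Date table; the defect fact's date column would also work
RETURN
    CALCULATE (
        MAX ( 'Error Threshold'[Threshold] ),    -- one threshold row is expected per category and date window
        FILTER (
            'Error Threshold',
            'Error Threshold'[Threshold Date] <= SelectedDate
                && 'Error Threshold'[End Date] >= SelectedDate
        )
    )

With Category and Category Detail on a visual, each defect then picks up its own currently active threshold.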
Hope this helps. Please reach out if you need further assistance.
Attached .pbix file for reference.
Thank you.
Hi @scoutmedic ,
Just wanted to check whether the response above was helpful. If you need further assistance, please reach out.
Thank you.
Bumping for visibility
You are looking at a "Slowly Changing Dimension" scenario, most likely Type 2 (SCD2). Keep track of your threshold changes in that dimension table and adjust your measures to include date-range filters.
No need for SCD2. We perform internal analysis of defect rates to determine our acceptable thresholds. I just need to adjust based on date filters.
One challenge is that we track about 100 different defects. I'm not sure whether that affects the solution.
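The roughly 100 defect types should not change the approach: the Category and Category Detail columns on the threshold rows do the matching, so a single measure covers all of them. As a purely illustrative sketch (assuming your existing error-rate measure is named [Error Rate] and the [Active Threshold] lookup sketched earlier), the date-filtered comparison could look like:

Defect Over Threshold =
VAR CurrentRate = [Error Rate]               -- existing error-rate measure (name assumed)
VAR CurrentThreshold = [Active Threshold]    -- date-aware threshold lookup (name assumed)
RETURN
    IF (
        NOT ISBLANK ( CurrentRate ) && NOT ISBLANK ( CurrentThreshold ),
        IF ( CurrentRate > CurrentThreshold, 1, 0 )
    )

The 1/0 result can drive conditional formatting, or be wrapped in SUMX over the defect categories to count how many are currently above their threshold.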