Hi all,
I'm trying to create a report which will give us a live feed on how much time each production line has gained / lost as a day goes on.
The intention is to have a live feed update each hour as the day goes on.
(So if the time is 09:05, the measure will show only the total minutes gained in hours 6, 7 and 8.)
The production day runs from Hour Number 6 through Hour Number 5 of the next day, at which point the accumulated total should reset to zero.
I currently have no problem showing an accumulated total for an entire day in Power BI, but I need a running total within the day.
Table is "SIC Summary"
Data and "Accumulated Minutes" expected result below:
All help gratefully received!
Site | Line | Prod_Date | Hour Number | Minutes loss / gain | Accumulated Minutes |
Site1 | Line 1 | 18/03/2024 | 6 | 1.01 | 1.01 |
Site1 | Line 1 | 18/03/2024 | 7 | -2.32 | -1.32 |
Site1 | Line 1 | 18/03/2024 | 8 | 6.83 | 5.52 |
Site1 | Line 1 | 18/03/2024 | 9 | -3.84 | 1.67 |
Site1 | Line 1 | 18/03/2024 | 10 | -2.19 | -0.52 |
Site1 | Line 1 | 18/03/2024 | 11 | 9.28 | 8.77 |
Site1 | Line 1 | 18/03/2024 | 12 | 0.79 | 9.56 |
Site1 | Line 1 | 18/03/2024 | 13 | -3.88 | 5.68 |
Site1 | Line 1 | 18/03/2024 | 14 | -6.00 | -0.32 |
Site1 | Line 1 | 18/03/2024 | 15 | -48.00 | -48.32 |
Site1 | Line 1 | 18/03/2024 | 16 | -16.14 | -64.46 |
Site1 | Line 1 | 18/03/2024 | 17 | -1.38 | -65.84 |
Site1 | Line 1 | 18/03/2024 | 18 | -0.18 | -66.03 |
Site1 | Line 1 | 19/03/2024 | 6 | -6.37 | -6.37 |
Site1 | Line 1 | 19/03/2024 | 7 | 7.21 | 0.84 |
Site1 | Line 1 | 19/03/2024 | 8 | 6.43 | 7.27 |
Site1 | Line 1 | 19/03/2024 | 9 | -12.50 | -5.23 |
Site1 | Line 1 | 19/03/2024 | 10 | -6.29 | -11.52 |
Site1 | Line 1 | 19/03/2024 | 11 | 19.72 | 8.19 |
Site1 | Line 1 | 19/03/2024 | 12 | 48.03 | 56.22 |
Site1 | Line 1 | 19/03/2024 | 13 | 28.81 | 85.03 |
Site1 | Line 1 | 19/03/2024 | 14 | -27.50 | 57.53 |
Site1 | Line 1 | 19/03/2024 | 15 | -14.03 | 43.49 |
Site1 | Line 1 | 19/03/2024 | 16 | 18.10 | 61.59 |
Site1 | Line 1 | 19/03/2024 | 17 | -54.81 | 6.78 |
Site1 | Line 1 | 19/03/2024 | 18 | -8.12 | -1.34 |
Site1 | Line 2 | 19/03/2024 | 6 | -9.94 | -9.94 |
Site1 | Line 2 | 19/03/2024 | 7 | 0.66 | -9.28 |
Site1 | Line 2 | 19/03/2024 | 8 | -3.50 | -12.78 |
Site1 | Line 2 | 19/03/2024 | 9 | 13.39 | 0.61 |
Site1 | Line 2 | 19/03/2024 | 10 | -24.48 | -23.87 |
Site1 | Line 2 | 19/03/2024 | 11 | 5.77 | -18.11 |
Site1 | Line 2 | 19/03/2024 | 12 | 16.48 | -1.63 |
Site1 | Line 2 | 19/03/2024 | 13 | 27.88 | 26.25 |
Site1 | Line 2 | 19/03/2024 | 14 | -1.76 | 24.49 |
Site1 | Line 2 | 19/03/2024 | 15 | 12.95 | 37.44 |
Site1 | Line 2 | 19/03/2024 | 16 | -1.66 | 35.78 |
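The production-day rule above (hours 0-5 belong to the previous day's production date) can be sketched like this; `production_date` is just an illustrative helper, not part of the model:

```python
from datetime import date, timedelta

def production_date(calendar_date, hour):
    """Return the production day a clock hour belongs to.

    The production day runs from hour 6 through hour 5 of the next
    calendar day, so hours 0-5 roll back to the previous date.
    """
    return calendar_date - timedelta(days=1) if hour < 6 else calendar_date

# Hour 9 on 19/03 stays on 19/03; hour 3 on 19/03 rolls back to 18/03.
print(production_date(date(2024, 3, 19), 9))  # 2024-03-19
print(production_date(date(2024, 3, 19), 3))  # 2024-03-18
```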
Hi @Bindi_Baji ,
Based on the sample data you provided, I did the following steps:
First, open Power Query and group by Line and Prod_Date.
Then add an index column to each grouped table.
Then merge Line and Prod_Date into a new column.
You can paste the following code into the Advanced Editor:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("jZQxbsMwDEWvUniOGJGSJfEO3ToGGTNk6dT7oyLtyZKQP8SIlQeCId/X47H9vP9evN227/fv68u+cLvHdJcoub8UO6DI2/P2Ca39E4SSAGyz0tQSgKqVTdQywHI8emBFYDtRkoaw0h+RKlQ3nR1Dhe0RCkLuRmaoqG0tcCGGhuaLY0pQadtciMQrWC/yhEKpAqw1UUlWoukgT17Jo1d5WGgH2EOeQrJasl7lYaW6sl2v9uRGEWnZ9ZFGDRnFoY9U7A+6QZzBPjz6jaA2XKE9g027Q73yZHYydUhJJyYPbPWQlkmaBrQdGZ2MbUDVl0JposXAnhdQpllOR9p+2qlO8jGyflLAwu5QnV5BI5yP/Fdkbu5QD5QikzvvId/I8x8=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Site = _t, Line = _t, Prod_Date = _t, #"Hour Number" = _t, #"Minutes loss / gain" = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Site", type text}, {"Line", type text}, {"Prod_Date", type text}, {"Hour Number", Int64.Type}, {"Minutes loss / gain", type number}}),
#"Grouped Rows" = Table.Group(#"Changed Type", {"Line", "Prod_Date"}, {{"Count", each _, type table [Site=nullable text, Line=nullable text, Prod_Date=nullable text, #"Hour Number"=nullable number, #"Minutes loss / gain"=nullable number]}}),
#"Added Custom" = Table.AddColumn(#"Grouped Rows", "Custom", each Table.AddIndexColumn([Count],"Index",1,1)),
#"Removed Columns" = Table.RemoveColumns(#"Added Custom",{"Count"}),
#"Expanded Custom" = Table.ExpandTableColumn(#"Removed Columns", "Custom", {"Site", "Hour Number", "Minutes loss / gain", "Index"}, {"Custom.Site", "Custom.Hour Number", "Custom.Minutes loss / gain", "Custom.Index"}),
#"Changed Type1" = Table.TransformColumnTypes(#"Expanded Custom",{{"Custom.Hour Number", type number}, {"Custom.Minutes loss / gain", type number}, {"Custom.Index", type number}}),
#"Inserted Merged Column" = Table.AddColumn(#"Changed Type1", "Merged", each Text.Combine({[Prod_Date], [Line]}, ""), type text)
in
#"Inserted Merged Column"
Finally, add a new calculated column:
Accumulated Minutes =
CALCULATE(
    SUM('Table'[Custom.Minutes loss / gain]),
    FILTER(
        'Table',
        'Table'[Merged] = EARLIER('Table'[Merged])
            && 'Table'[Custom.Index] <= EARLIER('Table'[Custom.Index])
    )
)
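For intuition, the same grouped running total can be sketched in plain Python (illustrative only; the rows are taken from the sample data, and the `running_totals` helper is hypothetical, not part of the Power BI solution):

```python
from itertools import accumulate, groupby

# A few rows of the sample data: (Line, Prod_Date, Hour Number, minutes loss/gain).
rows = [
    ("Line 1", "19/03/2024", 6, -6.37),
    ("Line 1", "19/03/2024", 7, 7.21),
    ("Line 1", "19/03/2024", 8, 6.43),
    ("Line 2", "19/03/2024", 6, -9.94),
    ("Line 2", "19/03/2024", 7, 0.66),
    ("Line 2", "19/03/2024", 8, -3.50),
]

def running_totals(rows):
    """Cumulative minutes per (Line, Prod_Date); the total resets whenever
    the line or the production day changes.  Rows must already be sorted
    by line, date and hour, as in the sample."""
    out = []
    for _, day in groupby(rows, key=lambda r: (r[0], r[1])):
        day = list(day)
        for row, total in zip(day, accumulate(r[3] for r in day)):
            out.append(row + (round(total, 2),))
    return out

for r in running_totals(rows):
    print(r)  # e.g. ('Line 1', '19/03/2024', 8, 6.43, 7.27)
```

The index column added in Power Query plays the role of the hour ordering here, and the Merged column plays the role of the (Line, Prod_Date) group key.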
Final output
Best regards,
Albert He
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.