Bindi_Baji
Frequent Visitor

Accumulated Minutes with Multiple Criteria

Hi all,
I'm trying to create a report which will give us a live feed of how much time each production line has gained or lost as the day goes on.

The intention is to have a live feed update each hour as the day goes on.

(So if the time is 09:05, the measure should show only the total minutes gained in hours 6, 7 and 8.)

The production day runs from Hour Number 6 to Hour Number 5 of the next day, at which point the accumulated total should reset to zero.
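The day-boundary rule above (hours 0-5 belong to the previous day's production run) can be sketched as follows. This is only an illustration of the mapping, not part of the Power BI model; the function name `production_date` is made up for the example.

```python
from datetime import date, datetime, timedelta

def production_date(ts: datetime) -> date:
    """Map a clock timestamp to its production date.

    The production day starts at 06:00, so hours 0-5 are counted
    against the previous calendar day's run.
    """
    if ts.hour < 6:
        return (ts - timedelta(days=1)).date()
    return ts.date()
```

For example, 03:00 on 19/03/2024 still belongs to the 18/03/2024 production day, while 06:00 starts a new one.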

 

I currently have no problem showing an accumulated total for an entire day in PBI, but I need a running total within each production day.

 

Table is "SIC Summary"

Data and "Accumulated Minutes" expected result below:

All help gratefully received!

 

| Site | Line | Prod_Date | Hour Number | Minutes loss / gain | Accumlated Minutes |
| --- | --- | --- | --- | --- | --- |
| Site1 | Line 1 | 18/03/2024 | 6 | 1.01 | 1.01 |
| Site1 | Line 1 | 18/03/2024 | 7 | -2.32 | -1.32 |
| Site1 | Line 1 | 18/03/2024 | 8 | 6.83 | 5.52 |
| Site1 | Line 1 | 18/03/2024 | 9 | -3.84 | 1.67 |
| Site1 | Line 1 | 18/03/2024 | 10 | -2.19 | -0.52 |
| Site1 | Line 1 | 18/03/2024 | 11 | 9.28 | 8.77 |
| Site1 | Line 1 | 18/03/2024 | 12 | 0.79 | 9.56 |
| Site1 | Line 1 | 18/03/2024 | 13 | -3.88 | 5.68 |
| Site1 | Line 1 | 18/03/2024 | 14 | -6.00 | -0.32 |
| Site1 | Line 1 | 18/03/2024 | 15 | -48.00 | -48.32 |
| Site1 | Line 1 | 18/03/2024 | 16 | -16.14 | -64.46 |
| Site1 | Line 1 | 18/03/2024 | 17 | -1.38 | -65.84 |
| Site1 | Line 1 | 18/03/2024 | 18 | -0.18 | -66.03 |
| Site1 | Line 1 | 19/03/2024 | 6 | -6.37 | -6.37 |
| Site1 | Line 1 | 19/03/2024 | 7 | 7.21 | 0.84 |
| Site1 | Line 1 | 19/03/2024 | 8 | 6.43 | 7.27 |
| Site1 | Line 1 | 19/03/2024 | 9 | -12.50 | -5.23 |
| Site1 | Line 1 | 19/03/2024 | 10 | -6.29 | -11.52 |
| Site1 | Line 1 | 19/03/2024 | 11 | 19.72 | 8.19 |
| Site1 | Line 1 | 19/03/2024 | 12 | 48.03 | 56.22 |
| Site1 | Line 1 | 19/03/2024 | 13 | 28.81 | 85.03 |
| Site1 | Line 1 | 19/03/2024 | 14 | -27.50 | 57.53 |
| Site1 | Line 1 | 19/03/2024 | 15 | -14.03 | 43.49 |
| Site1 | Line 1 | 19/03/2024 | 16 | 18.10 | 61.59 |
| Site1 | Line 1 | 19/03/2024 | 17 | -54.81 | 6.78 |
| Site1 | Line 1 | 19/03/2024 | 18 | -8.12 | -1.34 |
| Site1 | Line 2 | 19/03/2024 | 6 | -9.94 | -9.94 |
| Site1 | Line 2 | 19/03/2024 | 7 | 0.66 | -9.28 |
| Site1 | Line 2 | 19/03/2024 | 8 | -3.50 | -12.78 |
| Site1 | Line 2 | 19/03/2024 | 9 | 13.39 | 0.61 |
| Site1 | Line 2 | 19/03/2024 | 10 | -24.48 | -23.87 |
| Site1 | Line 2 | 19/03/2024 | 11 | 5.77 | -18.11 |
| Site1 | Line 2 | 19/03/2024 | 12 | 16.48 | -1.63 |
| Site1 | Line 2 | 19/03/2024 | 13 | 27.88 | 26.25 |
| Site1 | Line 2 | 19/03/2024 | 14 | -1.76 | 24.49 |
| Site1 | Line 2 | 19/03/2024 | 15 | 12.95 | 37.44 |
| Site1 | Line 2 | 19/03/2024 | 16 | -1.66 | 35.78 |
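To make the expected behaviour concrete, here is a small sketch (outside Power BI) of the running total that resets at each new (Line, Prod_Date) group, using the first few rows of the table above. The function name `running_totals` is invented for the example; note that because the posted minute values are already rounded to two decimals, the recomputed totals can differ from the posted column by about 0.01.

```python
# Sample rows from the table above: (Line, Prod_Date, Hour Number, Minutes loss/gain)
rows = [
    ("Line 1", "18/03/2024", 6, 1.01),
    ("Line 1", "18/03/2024", 7, -2.32),
    ("Line 1", "18/03/2024", 8, 6.83),
    ("Line 1", "19/03/2024", 6, -6.37),
    ("Line 1", "19/03/2024", 7, 7.21),
]

def running_totals(rows):
    """Running total of minutes that resets whenever (Line, Prod_Date)
    changes. Rows are assumed pre-sorted by line, production date, hour."""
    out, key, total = [], None, 0.0
    for line, prod_date, hour, minutes in rows:
        if (line, prod_date) != key:   # new production day / line -> reset
            key, total = (line, prod_date), 0.0
        total += minutes
        out.append(round(total, 2))
    return out
```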
1 ACCEPTED SOLUTION
v-heq-msft
Community Support
Community Support

Hi @Bindi_Baji ,
Based on the sample data you provided, I did the following steps:
First, open Power Query and group by Line and Prod_Date.

vheqmsft_1-1714099514216.png

Then add an index column to each grouped sub-table.

vheqmsft_2-1714099577992.png

Then merge Line and Prod_Date into a new column.

vheqmsft_3-1714099653060.png
You can paste the following code into the Advanced Editor:

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("jZQxbsMwDEWvUniOGJGSJfEO3ToGGTNk6dT7oyLtyZKQP8SIlQeCId/X47H9vP9evN227/fv68u+cLvHdJcoub8UO6DI2/P2Ca39E4SSAGyz0tQSgKqVTdQywHI8emBFYDtRkoaw0h+RKlQ3nR1Dhe0RCkLuRmaoqG0tcCGGhuaLY0pQadtciMQrWC/yhEKpAqw1UUlWoukgT17Jo1d5WGgH2EOeQrJasl7lYaW6sl2v9uRGEWnZ9ZFGDRnFoY9U7A+6QZzBPjz6jaA2XKE9g027Q73yZHYydUhJJyYPbPWQlkmaBrQdGZ2MbUDVl0JposXAnhdQpllOR9p+2qlO8jGyflLAwu5QnV5BI5yP/Fdkbu5QD5QikzvvId/I8x8=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Site = _t, Line = _t, Prod_Date = _t, #"Hour Number" = _t, #"Minutes loss / gain" = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Site", type text}, {"Line", type text}, {"Prod_Date", type text}, {"Hour Number", Int64.Type}, {"Minutes loss / gain", type number}}),
    #"Grouped Rows" = Table.Group(#"Changed Type", {"Line", "Prod_Date"}, {{"Count", each _, type table [Site=nullable text, Line=nullable text, Prod_Date=nullable text, #"Hour Number"=nullable number, #"Minutes loss / gain"=nullable number]}}),
    #"Added Custom" = Table.AddColumn(#"Grouped Rows", "Custom", each Table.AddIndexColumn([Count],"Index",1,1)),
    #"Removed Columns" = Table.RemoveColumns(#"Added Custom",{"Count"}),
    #"Expanded Custom" = Table.ExpandTableColumn(#"Removed Columns", "Custom", {"Site", "Hour Number", "Minutes loss / gain", "Index"}, {"Custom.Site", "Custom.Hour Number", "Custom.Minutes loss / gain", "Custom.Index"}),
    #"Changed Type1" = Table.TransformColumnTypes(#"Expanded Custom",{{"Custom.Hour Number", type number}, {"Custom.Minutes loss / gain", type number}, {"Custom.Index", type number}}),
    #"Inserted Merged Column" = Table.AddColumn(#"Changed Type1", "Merged", each Text.Combine({[Prod_Date], [Line]}, ""), type text)
in
    #"Inserted Merged Column"

 


Finally, add a new calculated column:

 

Accumlated Minutes = 
CALCULATE(
    SUM('Table'[Custom.Minutes loss / gain]),
    FILTER(
        'Table',
        'Table'[Merged] = EARLIER('Table'[Merged]) && 'Table'[Custom.Index] <= EARLIER('Table'[Custom.Index])
    )
)
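The CALCULATE/FILTER/EARLIER pattern above computes each row's value by scanning the whole table for rows with the same Merged key and an index no greater than the current row's. A rough Python re-expression of that filter logic (an illustration only, with hypothetical names `accumulated_minutes`, `merged`, `index`, `minutes`):

```python
def accumulated_minutes(table):
    """For each row, sum 'minutes' over all rows that share the same
    merged key and have index <= this row's index -- mirroring the
    CALCULATE/FILTER/EARLIER pattern (an O(n^2) per-row scan)."""
    result = []
    for r in table:
        total = sum(s["minutes"] for s in table
                    if s["merged"] == r["merged"] and s["index"] <= r["index"])
        result.append(round(total, 2))
    return result

# Tiny sample in the shape produced by the Power Query steps above
table = [
    {"merged": "18/03/2024Line 1", "index": 1, "minutes": 1.01},
    {"merged": "18/03/2024Line 1", "index": 2, "minutes": -2.32},
    {"merged": "19/03/2024Line 1", "index": 1, "minutes": -6.37},
]
```

Because the per-day index restarts at 1 for each Merged group, the total resets automatically at each new production day.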

 

Final output

vheqmsft_5-1714100064592.png

 

Best regards,

Albert He

 

If this post helps, please consider accepting it as the solution to help other members find it more quickly.
