Hi All,
I have the scan data of the work performed by each operator each day, and I am trying to find the time duration between the first scan and the last scan of the day, per operator per day.
Raw data
Desired total-minutes result, like any of the below:
I am struggling with what kind of formula to use on the raw data to get the desired result shown in the final photo.
Any help is highly appreciated.
Thanks.
Hello @Anonymous
you can use a group-by function that does the job:
{
{
"Total minutes scan",
(tablescan)=> Duration.TotalMinutes(List.Max(tablescan[Scan finish])- List.Min(tablescan[Scan start])),
type number
}
}
Check out the complete example below (it's not your exact data):
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45W8srPyFPSUTI20DOw1DMyQGIqmFgZGAARqpApWChWh346Dc2QtLqUJ2bi0muE31Z8Wgk4eEhoNbTADGIDQz1DA4hWGFPBwAKuFy5maAAVo7NeI0MkvTD/YtNsaIzFYmMrwxGn2RQWYLEA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Operator = _t, Date = _t, #"Scan start" = _t, #"Scan finish" = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Operator", type text}, {"Date", type date, "de-DE"}, {"Scan start", type datetime, "de-DE"}, {"Scan finish", type datetime,"de-DE"}}),
#"Grouped Rows" = Table.Group
(
#"Changed Type",
{"Operator", "Date"},
{
{
"Total minutes scan",
(tablescan)=> Duration.TotalMinutes(List.Max(tablescan[Scan finish])- List.Min(tablescan[Scan start])),
type number
}
}
)
in
#"Grouped Rows"
Copy-paste this code into the Advanced Editor in a new blank query to see how the solution works.
If this post helps or solves your problem, please mark it as solution (to help other users find useful content and to acknowledge the work of users that helped you)
Kudos are nice too
Have fun
Jimmy
Hi @Jimmy801 ,
I am a very basic user of Power BI, and I'm really sorry that I don't understand your full message. I don't know where to use the two pieces of code you sent.
Would you please be kind enough to attach the example file, so I can see the steps in the query?
Looking forward to your help.
Hello @Anonymous
There is no need to post the file. Create a new blank query, open the Advanced Editor and copy-paste my whole code, so you can see how it works. You can then substitute the first two steps of my solution with your own code (once you have recorded it). The only thing you have to do is replace the previous step "Changed Type" with your real previous step name.
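If it helps, here is a minimal sketch of that substitution, which you can paste into a blank query as-is. The step #"My Previous Step" and its inline sample rows are placeholders standing in for your own query steps; only the #"Grouped Rows" step is the actual solution, pointed at that previous step name and using the Scan start / Scan finish columns from the example above:
let
    // placeholder step standing in for your own query steps (hypothetical sample data)
    #"My Previous Step" = #table(
        type table [Operator = text, Date = date, #"Scan start" = datetime, #"Scan finish" = datetime],
        {
            {"A", #date(2020, 5, 4), #datetime(2020, 5, 4, 8, 0, 0), #datetime(2020, 5, 4, 8, 30, 0)},
            {"A", #date(2020, 5, 4), #datetime(2020, 5, 4, 15, 45, 0), #datetime(2020, 5, 4, 16, 10, 0)}
        }
    ),
    // the grouping step from the answer, referencing the previous step name of this query
    #"Grouped Rows" = Table.Group(
        #"My Previous Step",
        {"Operator", "Date"},
        {
            {
                "Total minutes scan",
                (tablescan) => Duration.TotalMinutes(List.Max(tablescan[Scan finish]) - List.Min(tablescan[Scan start])),
                type number
            }
        }
    )
in
    #"Grouped Rows"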
If this post helps or solves your problem, please mark it as solution (to help other users find useful content and to acknowledge the work of users that helped you)
Kudos are nice too
Have fun
Jimmy
Hi @Jimmy801 ,
Many thanks for the help. This code is working great, but I have one last question: what is the meaning of this code? What is the thinking process behind it?
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45W8srPyFPSUTI20DOw1DMyQGIqmFgZGAARqpApWChWh346Dc2QtLqUJ2bi0muE31Z8Wgk4eEhoNbTADGIDQz1DA4hWGFPBwAKuFy5maAAVo7NeI0MkvTD/YtNsaIzFYmMrwxGn2RQWYLEA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Operator = _t, Date = _t, #"Scan start" = _t, #"Scan finish" = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Operator", type text}, {"Date", type date, "de-DE"}, {"Scan start", type datetime, "de-DE"}, {"Scan finish", type datetime,"de-DE"}}),
#"Grouped Rows" = Table.Group
(
#"Changed Type",
{"Operator", "Date"},
{
{
"Total minutes scan",
(tablescan)=> Duration.TotalMinutes(List.Max(tablescan[Scan finish])- List.Min(tablescan[Scan start])),
type number
}
}
)
in
#"Grouped Rows"
Hi @Jimmy801 ,
For example, I have this data now; how can I do the same thing with it? What steps do I need to follow to get the total time for each operator each day, considering the first and last scan times for that day?
When I copy and paste the code you sent me, it shows me a different table.
Hello @Anonymous
open the Advanced Editor, put a "," at the end of your last step, and paste this code before the in keyword:
#"Grouped Rows" = Table.Group
(
#"Renamed Columns1",
{"Operator", "Date"},
{
{
"Total minutes scan",
(tablescan)=> Duration.TotalMinutes(List.Max(tablescan[Finish])- List.Min(tablescan[Start])),
type number
}
}
)
Then put this variable #"Grouped Rows" instead of your #"Renamed Columns1" after the in keyword.
Copy-paste this code into the Advanced Editor in a new blank query to see how the solution works.
If this post helps or solves your problem, please mark it as solution (to help other users find useful content and to acknowledge the work of users that helped you)
Kudos are nice too
Have fun
Jimmy
Hi @Jimmy801 ,
Many thanks for the explanation, now I understand the principle.
I did as explained, but I'm getting an error at the end:
let
Source = SharePoint.Tables("https://amfurnitura.sharepoint.com/sites/PRODUCTIONPLAN", [ApiVersion = 15]),
#"e93d9951-2f27-468b-9ff0-ab6a002c6fbf" = Source{[Id="e93d9951-2f27-468b-9ff0-ab6a002c6fbf"]}[Items],
#"Renamed Columns" = Table.RenameColumns(#"e93d9951-2f27-468b-9ff0-ab6a002c6fbf",{{"ID", "ID.1"}}),
#"Removed Other Columns" = Table.SelectColumns(#"Renamed Columns",{"Orderno", "Product", "Process", "Start", "Operator", "Finish"}),
#"Reordered Columns" = Table.ReorderColumns(#"Removed Other Columns",{"Orderno", "Product", "Process", "Operator", "Start", "Finish"}),
#"Removed Other Columns1" = Table.SelectColumns(#"Reordered Columns",{"Operator", "Start", "Finish"}),
#"Duplicated Column" = Table.DuplicateColumn(#"Removed Other Columns1", "Start", "Start - Copy"),
#"Changed Type" = Table.TransformColumnTypes(#"Duplicated Column",{{"Start - Copy", type date}, {"Start", type datetime}, {"Finish", type datetime}}),
#"Reordered Columns1" = Table.ReorderColumns(#"Changed Type",{"Operator", "Start - Copy", "Start", "Finish"}),
#"Renamed Columns1" = Table.RenameColumns(#"Reordered Columns1",{{"Start - Copy", "Date"}}),
#"Grouped Rows" = Table.Group
(
#"Renamed Columns1",
{"Operator", "Date"},
{
{
"Total minutes scan",
(#"Aluminium Output (2)" > Duration.TotalMinutes(List.Max(#"Aluminium Output (2)" [Finish])- List.Min(#"Aluminium Output (2)")[Start])),
type number
}
}
)
in #"Grouped Rows"
Please let me know what adjustment I need to make to solve this issue.
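As an aside, the error above most likely comes from the aggregation entry: compared with the snippet posted earlier in the thread, the (tablescan)=> lambda has been replaced by the query name #"Aluminium Output (2)" and a bare ">", so Table.Group no longer receives a function. A minimal sketch of what that entry presumably needs to look like, assuming the columns are still named Start and Finish after #"Renamed Columns1":
// the third element of each aggregation must stay a function of the grouped sub-table, here called tablescan
(tablescan) => Duration.TotalMinutes(List.Max(tablescan[Finish]) - List.Min(tablescan[Start]))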
Thanks, all good. It works. Many thanks for your knowledge and help.
Hello @Anonymous
the group function provides the grouped rows as a table, represented as "tablescan" in my code. I'm taking the max value of the finish date minus the min value of the start date. In Power Query, datetime - datetime is a duration value. With the function Duration.TotalMinutes you get exactly what you need.
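To see that last point in isolation, here is a tiny sketch you can paste into a blank query (the two datetimes are made up):
let
    firstScan = #datetime(2020, 5, 4, 8, 0, 0),
    lastScan = #datetime(2020, 5, 4, 16, 30, 0),
    // datetime - datetime gives a duration value: #duration(0, 8, 30, 0)
    elapsed = lastScan - firstScan,
    // Duration.TotalMinutes converts that duration into minutes: 510
    minutes = Duration.TotalMinutes(elapsed)
in
    minutes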
If this post helps or solves your problem, please mark it as solution (to help other users find useful content and to acknowledge the work of users that helped you)
Kudos are nice too
Have fun
Jimmy
Hi @Anonymous
You could do it with Group By. If you post the initial table in text-tabular format, so that the data can be copied, we can get into the details.
Please mark the question solved when done and consider giving kudos if posts are helpful.
Contact me privately for support with any larger-scale BI needs, tutoring, etc.
Cheers