
Zneta
Frequent Visitor

Use Power Query to subtract time frames from a complete day

Hello everybody,

 

There might be an easier solution to my problem, or it might be better not to do this in Power Query at all. Any help is appreciated:

 

I have a Dataset with start and end times like this:

Start: 10:30 am, End: 10:33 am

Start: 10:31 am, End: 10:45 am

 

See the picture below.

 

Many of these time frames overlap each other. I want a way to show the time (in seconds or hours) of a full 24-hour day during which none of these time frames happened. So if a day has 1440 minutes and I only have two time frames (10:30 am to 10:45 am and 10:40 am to 10:45 am), I want the result to be 1425 minutes (and not 1420).

I hope this makes sense.
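 

To illustrate, here is a rough M sketch of the calculation I have in mind for the two example frames (the step names and functions here are just an illustration, not my actual query):

let
    // the two example frames from above
    frames = {
        [Start = #time(10,30,0), End = #time(10,45,0)],
        [Start = #time(10,40,0), End = #time(10,45,0)]
    },
    // enumerate each covered minute, then union so overlapping minutes are only counted once
    coveredMinutes = List.Union(
        List.Transform(frames, each List.Times([Start], Duration.TotalMinutes([End] - [Start]), #duration(0,0,1,0)))
    ),
    // 1440 minutes in a day minus the distinct covered minutes -> 1440 - 15 = 1425
    result = 1440 - List.Count(coveredMinutes)
in
    result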

 

Thanks

Capture9.JPG

4 REPLIES
Zneta
Frequent Visitor

Thanks for the answers, I will test all suggested solutions. As requested, here is the data that I have. My expected result should be a time frame in seconds where none of these time frames happen.
Ideally I want to have a visual timeline that displays the time frames in red and the times in between the given frames in green.

 

Start                          End

5/14/2024 6:21:03 AM -05:00    5/14/2024 6:49:19 AM -05:00
5/14/2024 6:31:39 AM -05:00    5/14/2024 6:31:44 AM -05:00
5/14/2024 6:36:39 AM -05:00    5/14/2024 6:38:32 AM -05:00
5/14/2024 6:37:36 AM -05:00    5/14/2024 6:38:06 AM -05:00
5/14/2024 6:55:14 AM -05:00    5/14/2024 6:55:32 AM -05:00
5/14/2024 6:55:37 AM -05:00    5/14/2024 6:55:51 AM -05:00
5/14/2024 7:06:01 AM -05:00    5/14/2024 7:06:15 AM -05:00
5/14/2024 7:16:03 AM -05:00    5/14/2024 7:17:57 AM -05:00
5/14/2024 7:43:26 AM -05:00    5/14/2024 7:43:26 AM -05:00
5/14/2024 7:43:27 AM -05:00    5/14/2024 7:47:12 AM -05:00
5/14/2024 7:49:21 AM -05:00    5/14/2024 7:52:00 AM -05:00
5/14/2024 7:58:30 AM -05:00    5/14/2024 7:59:02 AM -05:00
5/14/2024 8:00:21 AM -05:00    5/14/2024 8:00:21 AM -05:00
5/14/2024 8:00:22 AM -05:00    5/14/2024 8:00:37 AM -05:00
5/14/2024 8:01:43 AM -05:00    5/14/2024 8:03:31 AM -05:00
5/14/2024 8:03:09 AM -05:00    5/14/2024 8:05:10 AM -05:00
5/14/2024 8:04:39 AM -05:00    5/14/2024 8:04:39 AM -05:00
5/14/2024 8:04:40 AM -05:00    5/14/2024 8:05:49 AM -05:00
5/14/2024 8:10:08 AM -05:00    5/14/2024 8:10:08 AM -05:00
5/14/2024 8:10:09 AM -05:00    5/14/2024 8:10:56 AM -05:00
5/14/2024 8:17:04 AM -05:00    5/14/2024 8:17:56 AM -05:00
5/14/2024 8:23:24 AM -05:00    5/14/2024 8:24:13 AM -05:00
5/14/2024 8:26:22 AM -05:00    5/14/2024 8:35:36 AM -05:00
5/14/2024 8:30:59 AM -05:00    5/14/2024 8:30:59 AM -05:00
5/14/2024 8:31:42 AM -05:00    5/14/2024 8:32:53 AM -05:00
5/14/2024 8:34:39 AM -05:00    5/14/2024 8:35:32 AM -05:00
5/14/2024 8:36:39 AM -05:00    5/14/2024 8:38:04 AM -05:00
5/14/2024 8:40:53 AM -05:00    5/14/2024 8:42:15 AM -05:00
5/14/2024 8:51:40 AM -05:00    5/14/2024 8:51:49 AM -05:00
5/14/2024 9:01:52 AM -05:00    5/14/2024 9:01:52 AM -05:00
5/14/2024 9:03:16 AM -05:00    5/14/2024 9:03:21 AM -05:00
5/14/2024 9:09:00 AM -05:00    5/14/2024 9:10:01 AM -05:00
5/14/2024 9:12:04 AM -05:00    5/14/2024 9:12:57 AM -05:00
5/14/2024 9:15:13 AM -05:00    5/14/2024 9:15:14 AM -05:00
5/14/2024 9:15:44 AM -05:00    5/14/2024 9:16:18 AM -05:00
5/14/2024 9:18:49 AM -05:00    5/14/2024 9:18:59 AM -05:00
5/14/2024 9:25:06 AM -05:00    5/14/2024 9:26:15 AM -05:00
5/14/2024 9:27:30 AM -05:00    5/14/2024 9:27:31 AM -05:00
5/14/2024 9:31:11 AM -05:00    5/14/2024 9:32:01 AM -05:00
5/14/2024 9:33:31 AM -05:00    5/14/2024 9:33:39 AM -05:00
5/14/2024 9:33:54 AM -05:00    5/14/2024 9:34:06 AM -05:00
5/14/2024 9:34:15 AM -05:00    5/14/2024 9:34:21 AM -05:00
5/14/2024 9:35:21 AM -05:00    5/14/2024 9:35:52 AM -05:00
5/14/2024 9:37:30 AM -05:00    5/14/2024 9:39:41 AM -05:00
5/14/2024 9:37:33 AM -05:00    5/14/2024 9:39:32 AM -05:00
5/14/2024 9:45:50 AM -05:00    5/14/2024 9:46:59 AM -05:00
5/14/2024 9:46:22 AM -05:00    5/14/2024 9:47:27 AM -05:00
5/14/2024 9:46:22 AM -05:00    5/14/2024 9:46:22 AM -05:00
5/14/2024 9:49:11 AM -05:00    5/14/2024 9:50:00 AM -05:00
5/14/2024 9:52:15 AM -05:00    5/14/2024 9:53:11 AM -05:00
5/14/2024 10:00:12 AM -05:00   5/14/2024 10:00:17 AM -05:00
5/14/2024 10:10:06 AM -05:00   5/14/2024 10:15:39 AM -05:00
5/14/2024 10:17:41 AM -05:00   5/14/2024 10:17:43 AM -05:00
5/14/2024 10:17:57 AM -05:00   5/14/2024 10:19:35 AM -05:00
5/14/2024 10:21:40 AM -05:00   5/14/2024 10:21:41 AM -05:00
5/14/2024 10:23:01 AM -05:00   5/14/2024 10:23:01 AM -05:00
5/14/2024 10:23:56 AM -05:00   5/14/2024 10:23:57 AM -05:00
5/14/2024 10:23:57 AM -05:00   5/14/2024 10:24:05 AM -05:00
5/14/2024 10:24:12 AM -05:00   5/14/2024 10:24:25 AM -05:00
5/14/2024 10:37:17 AM -05:00   5/14/2024 10:37:17 AM -05:00
5/14/2024 10:37:17 AM -05:00   5/14/2024 10:38:00 AM -05:00
5/14/2024 10:40:24 AM -05:00   5/14/2024 10:41:02 AM -05:00
5/14/2024 10:46:45 AM -05:00   5/14/2024 11:34:35 AM -05:00
5/14/2024 11:36:04 AM -05:00   5/14/2024 11:36:40 AM -05:00
5/14/2024 11:39:30 AM -05:00   5/14/2024 11:39:36 AM -05:00
5/14/2024 11:46:11 AM -05:00   5/14/2024 11:48:17 AM -05:00
5/14/2024 11:51:59 AM -05:00   5/14/2024 11:53:03 AM -05:00
5/14/2024 12:03:20 PM -05:00   5/14/2024 12:05:43 PM -05:00
5/14/2024 12:04:52 PM -05:00   5/14/2024 12:04:52 PM -05:00
5/14/2024 12:05:27 PM -05:00   5/14/2024 12:06:15 PM -05:00
5/14/2024 12:07:49 PM -05:00   5/14/2024 12:11:51 PM -05:00
5/14/2024 12:11:53 PM -05:00   5/14/2024 12:12:39 PM -05:00
5/14/2024 12:19:45 PM -05:00   5/14/2024 12:19:51 PM -05:00
5/14/2024 12:20:57 PM -05:00   5/14/2024 12:23:46 PM -05:00
5/14/2024 12:21:05 PM -05:00   5/14/2024 12:22:54 PM -05:00
5/14/2024 12:30:22 PM -05:00   5/14/2024 12:31:12 PM -05:00
5/14/2024 12:30:24 PM -05:00   5/14/2024 12:30:31 PM -05:00
5/14/2024 12:30:26 PM -05:00   5/14/2024 12:30:35 PM -05:00
5/14/2024 12:32:24 PM -05:00   5/14/2024 12:33:12 PM -05:00
5/14/2024 12:32:51 PM -05:00   5/14/2024 12:37:45 PM -05:00
5/14/2024 12:34:36 PM -05:00   5/14/2024 12:36:07 PM -05:00
5/14/2024 12:40:09 PM -05:00   5/14/2024 12:40:09 PM -05:00
5/14/2024 1:06:04 PM -05:00    5/14/2024 1:16:49 PM -05:00
5/14/2024 1:06:04 PM -05:00    5/14/2024 1:06:04 PM -05:00
5/14/2024 1:20:38 PM -05:00    5/14/2024 1:20:38 PM -05:00
5/14/2024 1:20:38 PM -05:00    5/14/2024 1:21:47 PM -05:00
5/14/2024 1:32:56 PM -05:00    5/14/2024 1:33:03 PM -05:00
5/14/2024 1:38:43 PM -05:00    5/14/2024 1:39:55 PM -05:00
5/14/2024 1:41:35 PM -05:00    5/14/2024 1:43:30 PM -05:00
5/14/2024 1:42:12 PM -05:00    5/14/2024 1:53:40 PM -05:00
5/14/2024 1:53:46 PM -05:00    5/14/2024 1:53:58 PM -05:00
5/14/2024 1:55:14 PM -05:00    5/14/2024 1:55:14 PM -05:00
5/14/2024 1:55:14 PM -05:00    5/14/2024 1:56:19 PM -05:00
5/14/2024 1:55:45 PM -05:00    5/14/2024 1:56:10 PM -05:00
5/14/2024 2:02:09 PM -05:00    5/14/2024 2:02:09 PM -05:00
collinsg
Super User

Good day Zneta

I've interpreted your question a little differently from Ibanez2000, so my reply may not be relevant. Nevertheless, your question was interesting, and perhaps what I describe may help somebody. My approach is a bit of "brute force".

  • For each start and finish pair of timestamps I calculate a list of all the times from start to end (my example uses an interval of one second).
  • I then take a union of all these lists. This yields a list of all fault seconds.
  • I convert that list to a table of datetimes and add a date column.
  • I group by date; a count of rows gives the number of fault seconds on each date.
  • Finally I calculate the fault free seconds.

While it's brute force and may not perform well on very large data sets, it should be robust for all cases, e.g. faults straddling midnight or spanning multiple days.

 

Starting from this...

collinsg_0-1718954729567.png

I get this...

collinsg_1-1718954752023.png

Using this code

let
    source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("jdBBCoAwDETRq4SuC50kjWKvUnL/a0gFwWpEt59HYNJ7EkgtsMJKLA1ogpSfVZE8R9oiXW+6EuvomG6fld+0/dEWavvQjFDLrBcSbbaN9Re90jFlfMp9Bw==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Start = _t, End = _t]),
    #"Changed Type" = Table.TransformColumnTypes(source,{{"Start", type datetime}, {"End", type datetime}}),
    #"Added Lists of Fault Seconds" = Table.AddColumn(#"Changed Type", "Fault Periods", each List.DateTimes([Start],Duration.TotalSeconds([End]-[Start]),#duration(0,0,0,1))),
    #"Find the Union of All Fault Seconds" = List.Union( #"Added Lists of Fault Seconds"[Fault Periods] ),
    #"Converted to Table" = Table.FromList(#"Find the Union of All Fault Seconds", Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Changed Type to datetime" = Table.TransformColumnTypes(#"Converted to Table",{{"Column1", type datetime}}),
    #"Inserted Date" = Table.AddColumn(#"Changed Type to datetime", "Date", each DateTime.Date([Column1]), type date),
    #"Grouped by Date, Counting Fault Seconds" = Table.Group(#"Inserted Date", {"Date"}, {{"Fault Seconds", each Table.RowCount(_), Int64.Type}}),
    #"Calculate Fault Free Seconds" = Table.AddColumn(#"Grouped by Date, Counting Fault Seconds", "Fault Free Seconds", each 86400 - [Fault Seconds], Int64.Type)
in
    #"Calculate Fault Free Seconds"

Hope this helps.

 

WanderingBI
Resolver III

This is tricky. You can take a look at my solution to find all the overlap times. To check the logic, you can first remove all but the first two rows in my example table.

 

let
  source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("jcy7CcAwEATRVsTFAu19NlEr4vpvwzhQYPuMnQ6PWUsMFgMc6k1tAtMg/Vkdkr3SrHTcdDT1s+Py3lXfNP9olpofWlFqk8wD", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Start = _t, End = _t]),

  custom = Table.AddColumn(source, "Inner", each source),
  customTypes = Table.TransformColumnTypes(custom, {{"Start", type datetime}, {"End", type datetime}}),
  sourceMultiply = Table.ExpandTableColumn(customTypes, "Inner", {"Start", "End"}, {"Inner.Start", "Inner.End"}),
  sourceMultiplyClean = Table.TransformColumnTypes(sourceMultiply, {{"Inner.Start", type datetime}, {"Inner.End", type datetime}}),
  reorder = Table.ReorderColumns(sourceMultiplyClean, {"Start", "Inner.Start", "End", "Inner.End"}),

  // Delete self-multiplied rows
  reorderClean = Table.AddColumn(reorder, "IsItself", each if ([Start] = [Inner.Start] and [End] = [Inner.End]) then true else false),
  reorderCleanFiltered = Table.SelectRows(reorderClean, each ([IsItself] = false)),

  // functions for minDate, maxDate
  minDate = (date1, date2)=> if date1 < date2 then date1 else date2,
  maxDate = (date1, date2)=> if date1 >= date2 then date1 else date2,

  // Columns for minEnd and maxStart
  minEnd = Table.AddColumn(reorderCleanFiltered,"MinEnd",  each  minDate([End],[Inner.End]), type datetime),
  maxStart = Table.AddColumn(minEnd, "MaxStart", each maxDate([Start], [Inner.Start]), type datetime ),  
  
  // Calculate time difference
  durationColumn = Table.AddColumn(maxStart, "Duration", each [MinEnd] - [MaxStart], type duration),
  durationGreaterZero = Table.SelectRows(durationColumn, each [Duration] > #duration(0, 0, 0, 0)),

  // delete doubles
  keepDistinct = Table.Distinct(durationGreaterZero, {"MinEnd","MaxStart"})  

in
 keepDistinct

 

 

You could then first calculate your durations between faults per day and then subtract the overlaps for each day (see the sketch after the example below).

 

1. Add a column that holds the full source table.

2. Expand "Inner.Start" and "Inner.End".

3. Find the minimum end of each event.

4. Find the maximum start of each event.

5. Calculate the duration.

6. No overlap -> negative duration -> delete those rows.

7. Delete the rows where an event was multiplied by itself.

8. Each overlap time is produced twice, so keep only distinct rows.

Example as follows:

Ibanez2000_2-1718939957328.png

 

I am not completely sure if it catches all scenarios. Please test extensively.
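 

For the per-day subtraction mentioned above, a rough sketch could look like this (the faults table name and the step names are only placeholders, assuming keepDistinct from the query above and a faults table with datetime Start/End columns):

let
    // assumption: `faults` is your source table with datetime Start/End columns,
    // and `keepDistinct` is the overlap table produced by the query above
    addDay = Table.AddColumn(faults, "Date", each DateTime.Date([Start]), type date),
    // total raw fault seconds per day
    faultSeconds = Table.Group(addDay, {"Date"},
        {{"FaultSeconds", (t) => List.Sum(List.Transform(Table.ToRecords(t), (r) => Duration.TotalSeconds(r[End] - r[Start]))), type number}}),
    addOverlapDay = Table.AddColumn(keepDistinct, "Date", each DateTime.Date([MaxStart]), type date),
    // total double-counted overlap seconds per day
    overlapSeconds = Table.Group(addOverlapDay, {"Date"},
        {{"OverlapSeconds", (t) => List.Sum(List.Transform(Table.ToRecords(t), (r) => Duration.TotalSeconds(r[Duration]))), type number}}),
    // join the two per-day totals and work out the fault-free seconds in a 24-hour day
    joined = Table.NestedJoin(faultSeconds, {"Date"}, overlapSeconds, {"Date"}, "Overlaps", JoinKind.LeftOuter),
    expanded = Table.ExpandTableColumn(joined, "Overlaps", {"OverlapSeconds"}),
    result = Table.AddColumn(expanded, "FaultFreeSeconds",
        each 86400 - ([FaultSeconds] - (if [OverlapSeconds] = null then 0 else [OverlapSeconds])), type number)
in
    result

This only does the mechanical per-day subtraction described above, so the same caveat about testing extensively applies.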

 

 

lbendlin
Super User

Please provide sample data that covers your issue or question completely, in a usable format (not as a screenshot).

Do not include sensitive information or anything not related to the issue or question.

If you are unsure how to upload data please refer to https://community.fabric.microsoft.com/t5/Community-Blog/How-to-provide-sample-data-in-the-Power-BI-...

Please show the expected outcome based on the sample data you provided.

Want faster answers? https://community.fabric.microsoft.com/t5/Desktop/How-to-Get-Your-Question-Answered-Quickly/m-p/1447...
