leo0403
Frequent Visitor

Overlapping Alarms - How to calculate the total time of downtime?

Hello, it's my first time here and I would be very grateful if someone could help me with this task...

 

I have a 3-day extract of an alarm log covering different machines; each log line has its StartTime and EndTime columns filled. It's very common for alarms to overlap one another in different ways (complete overlap, partial overlap, or no overlap).

 

I need to process this data to summarize the total downtime (with overlaps removed; I have also seen the term "saturation" used in other posts) per Machine per Day. For example, the two Machine 2 alarms running 08:04:37 to 08:10:45 and 08:04:42 to 08:10:45 should count as roughly six minutes of downtime, not twelve:

 

Type  | AEG       | AlarmCode | Index | StartTime                   | EndTime
ERROR | Machine 1 | 239       | 535   | 2023-11-01T14:24:43.6210000 | 2023-11-01T14:26:33.8710000
ERROR | Machine 2 | 81        | 492   | 2023-11-02T08:04:37.2480000 | 2023-11-02T08:10:45.5140000
ERROR | Machine 2 | 22        | 493   | 2023-11-02T08:04:42.9820000 | 2023-11-02T08:10:45.5140000
ERROR | Machine 2 | 1405      | 486   | 2023-11-02T08:04:45.4760000 | 2023-11-02T08:10:47.5090000
ERROR | Machine 2 | 22        | 475   | 2023-11-02T08:22:02.5210000 | 2023-11-02T08:22:23.7340000
ERROR | Machine 2 | 1405      | 469   | 2023-11-02T08:22:03.7670000 | 2023-11-02T08:25:33.2080000
ERROR | Machine 2 | 22        | 473   | 2023-11-02T08:22:23.9630000 | 2023-11-02T08:25:32.7090000
ERROR | Machine 2 | 81        | 436   | 2023-11-02T08:36:05.2700000 | 2023-11-02T09:00:49.0560000
ERROR | Machine 2 | 22        | 389   | 2023-11-02T09:32:13.5200000 | 2023-11-02T10:45:16.1370000
ERROR | Machine 2 | 1405      | 384   | 2023-11-02T09:32:17.5090000 | 2023-11-02T10:45:16.1370000
ERROR | Machine 2 | 81        | 388   | 2023-11-02T09:32:26.2360000 | 2023-11-02T10:45:16.1370000
INFO  | Machine 2 | 1000024   | 383   | 2023-11-02T09:52:47.2220000 | 2023-11-02T10:45:17.8830000
ERROR | Machine 3 | 101       | 356   | 2023-11-02T11:47:39.3940000 | 2023-11-02T12:46:25.8990000
INFO  | Machine 3 | 1000024   | 359   | 2023-11-02T11:59:04.8150000 | 2023-11-02T12:44:09.4330000
ERROR | Machine 4 | 101       | 113   | 2023-11-03T14:23:20.8090000 | 2023-11-03T16:21:58.1280000
ERROR | Machine 4 | 22        | 117   | 2023-11-03T14:52:21.4570000 | 2023-11-03T16:21:58.1280000
ERROR | Machine 4 | 23        | 116   | 2023-11-03T14:52:21.4570000 | 2023-11-03T16:21:58.1280000
ERROR | Machine 4 | 28        | 114   | 2023-11-03T14:52:21.4570000 | 2023-11-03T16:21:58.1280000
ERROR | Machine 4 | 27        | 115   | 2023-11-03T14:52:21.4570000 | 2023-11-03T16:21:58.1280000
ERROR | Machine 4 | 3022      | 99    | 2023-11-03T14:52:25.6960000 | 2023-11-03T16:21:58.1280000
INFO  | Machine 4 | 1000024   | 92    | 2023-11-03T16:15:12.7890000 | 2023-11-03T16:21:58.7270000
ERROR | Machine 4 | 33        | 89    | 2023-11-03T16:22:04.4610000 | 2023-11-03T16:22:08.7020000
ERROR | Machine 4 | 33        | 84    | 2023-11-03T16:22:10.4460000 | 2023-11-03T16:22:13.4380000
ERROR | Machine 4 | 33        | 80    | 2023-11-03T16:22:16.4300000 | 2023-11-03T16:22:18.4270000
ERROR | Machine 4 | 33        | 74    | 2023-11-03T16:22:22.4140000 | 2023-11-03T16:24:00.6540000
INFO  | Machine 4 | 1000024   | 73    | 2023-11-03T16:23:47.9280000 | 2023-11-03T16:24:00.8940000

    

The expected output is a table like the one below; I'd prefer a Power Query solution, but other suggestions are welcome too:

 

leo0403_0-1700708903246.png

 

Thank you!


12 REPLIES
ronrsnfld
Super User

And here is a Power Query (M code) solution with output in whole minutes:

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("tZZNaxwxDIb/StlzIqwvW/K9hR7awJJb2EMIgfSS/3+sxptss7YnO23ZOQwG4Ucvrz5mHh52X/f7u/3uZvfj8enl1+vzF4wzscdbWZdzIr5FvE14j1JJqjBkwhTPGM2VGawco4ebkU5xtiWFOH28TvfJapLKBUish7copioKivIpnKjBeQYXAjf6D3iEF0vE8hSvICWv4gto8g3aiw7XiWoi0NH19ygxFN6mPfsUH4Bc5nhdikrJtmgffT+q88yrcIJywZhjx/DoOueaFKgJ7+FeU7jukDRfVs7W2+IhrCKH6xN4a5eKGZDLFtfZZIr/0xP/hm/GsNkMThmIJ+04hX//+e2ul74ESVqCvqxelZaOJpqM01uCAma8qp5bhiZfu7oiBrqyA7tM6JE4R9+Aua/J53P56kMCje4QMNR5AqnJQXhdvpzkI56Zw20PcqUENpY2oqE90hsgrQ+UvLclYhng4TwhiA7T+jfw5hDm68CtweU68NLgehU4p+a6+5SukH2YpxV615By1pDnn75GwBiY2IP2WccUWl8GTf1SVOu157beBSQPX49TNNiJNrD7krbbmEBkxZe2QYUvuN7Y89s5bg/79xQ1kC2elKluIhAcFswxGvOfIKtsq2fpN0BeNkCsR6fhV+YD397W2+HwGw==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Type = _t, AEG = _t, AlarmCode = _t, Index = _t, StartTime = _t, EndTime = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{
        {"Type", type text}, {"AEG", type text}, {"AlarmCode", Int64.Type}, {"Index", Int64.Type}, 
        {"StartTime", type datetime}, {"EndTime", type datetime}}),
    #"Round to Minute" = Table.TransformColumns(#"Changed Type",{
        {"StartTime", each DateTime.From(Number.Round(Number.From(_)*1440,0)/1440), type datetime},
        {"EndTime", each DateTime.From(Number.Round(Number.From(_)*1440,0)/1440), type datetime}}),
    #"Add Minutes List" = Table.AddColumn(#"Round to Minute","Minutes", 
        each List.DateTimes([StartTime], Duration.TotalMinutes([EndTime]-[StartTime]),#duration(0,0,1,0))),

    #"Grouped Rows" = Table.Group(#"Add Minutes List", {"AEG"}, {
        {"all", each _, 
            type table [Type=nullable text, AEG=nullable text, AlarmCode=nullable number, 
                        Index=nullable number, StartTime=datetime, EndTime=datetime, Minutes=list]}}),
    
//Add a column containing, per machine, the distinct list of all downtime minutes,
//  convert them to dates so we can count minutes per date,
//  then build a table and pivot it
    #"Added Custom" = Table.AddColumn(#"Grouped Rows", "Minutes", each 
        let 
            datesList = 
                List.Transform(
                    List.Distinct(
                        List.Combine([all][Minutes])),
                            each Date.From(_)),
            tbl = Table.FromColumns({datesList},{"Dates"}),
            #"Dates to Text" = Table.TransformColumns(tbl,{"Dates", each Date.ToText(_,"yyyy-MM-dd")}),
            addCounter = Table.AddColumn(#"Dates to Text","Counter", each 1, Int64.Type), 
            pivot = Table.Pivot(addCounter, List.Distinct(addCounter[Dates]),"Dates","Counter",List.Sum)
        in 
          pivot),
    #"Removed Columns" = Table.RemoveColumns(#"Added Custom",{"all"}),
    #"ColumnNames in Pivot" = List.Distinct(List.Combine(List.Transform(#"Removed Columns"[Minutes], each Table.ColumnNames(_)))),
    #"Expand Table" = Table.ExpandTableColumn(#"Removed Columns","Minutes",#"ColumnNames in Pivot")
in
    #"Expand Table"

 

ronrsnfld_0-1701001692586.png

 

 

You can refactor that to 

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("tZZNaxwxDIb/StlzIqwvW/K9hR7awJJb2EMIgfSS/3+sxptss7YnO23ZOQwG4Ucvrz5mHh52X/f7u/3uZvfj8enl1+vzF4wzscdbWZdzIr5FvE14j1JJqjBkwhTPGM2VGawco4ebkU5xtiWFOH28TvfJapLKBUish7copioKivIpnKjBeQYXAjf6D3iEF0vE8hSvICWv4gto8g3aiw7XiWoi0NH19ygxFN6mPfsUH4Bc5nhdikrJtmgffT+q88yrcIJywZhjx/DoOueaFKgJ7+FeU7jukDRfVs7W2+IhrCKH6xN4a5eKGZDLFtfZZIr/0xP/hm/GsNkMThmIJ+04hX//+e2ul74ESVqCvqxelZaOJpqM01uCAma8qp5bhiZfu7oiBrqyA7tM6JE4R9+Aua/J53P56kMCje4QMNR5AqnJQXhdvpzkI56Zw20PcqUENpY2oqE90hsgrQ+UvLclYhng4TwhiA7T+jfw5hDm68CtweU68NLgehU4p+a6+5SukH2YpxV615By1pDnn75GwBiY2IP2WccUWl8GTf1SVOu157beBSQPX49TNNiJNrD7krbbmEBkxZe2QYUvuN7Y89s5bg/79xQ1kC2elKluIhAcFswxGvOfIKtsq2fpN0BeNkCsR6fhV+YD397W2+HwGw==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Type = _t, AEG = _t, AlarmCode = _t, Index = _t, StartTime = _t, EndTime = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Type", type text}, {"AEG", type text}, {"AlarmCode", Int64.Type}, {"Index", Int64.Type}, {"StartTime", type datetime}, {"EndTime", type datetime}}),
    #"Start Date" = Table.AddColumn(#"Changed Type", "Date", each Date.From([StartTime]),type date),
    Interval = Table.AddColumn(#"Start Date", "Interval", each { Int64.From(Number.From([StartTime])*1440)..Int64.From(Number.From([EndTime])*1440-1)}),
    #"Grouped Rows" = Table.Group(Interval, {"AEG", "Date"}, {{"Rows", each _, type table [Type=nullable text, AEG=nullable text, AlarmCode=nullable number, Index=nullable number, StartTime=nullable datetime, EndTime=nullable datetime, Date=date, Interval=list]}}),
    #"Added Custom" = Table.AddColumn(#"Grouped Rows", "Minutes", each List.Count(List.Distinct(List.Combine([Rows][Interval]))),Int64.Type)
in
    #"Added Custom"

That sort of works also, but it seems that if the alarm time is > 24 hours, all of the time is allocated to the first date.

You are correct, and it would also be an issue if an outage just stretched over two days. Here's the adjusted code.

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("tZZNaxwxDIb/StlzIqwvW/K9hR7awJJb2EMIgfSS/3+sxptss7YnO23ZOQwG4Ucvrz5mHh52X/f7u/3uZvfj8enl1+vzF4wzscdbWZdzIr5FvE14j1JJqjBkwhTPGM2VGawco4ebkU5xtiWFOH28TvfJapLKBUish7copioKivIpnKjBeQYXAjf6D3iEF0vE8hSvICWv4gto8g3aiw7XiWoi0NH19ygxFN6mPfsUH4Bc5nhdikrJtmgffT+q88yrcIJywZhjx/DoOueaFKgJ7+FeU7jukDRfVs7W2+IhrCKH6xN4a5eKGZDLFtfZZIr/0xP/hm/GsNkMThmIJ+04hX//+e2ul74ESVqCvqxelZaOJpqM01uCAma8qp5bhiZfu7oiBrqyA7tM6JE4R9+Aua/J53P56kMCje4QMNR5AqnJQXhdvpzkI56Zw20PcqUENpY2oqE90hsgrQ+UvLclYhng4TwhiA7T+jfw5hDm68CtweU68NLgehU4p+a6+5SukH2YpxV615By1pDnn75GwBiY2IP2WccUWl8GTf1SVOu157beBSQPX49TNNiJNrD7krbbmEBkxZe2QYUvuN7Y89s5bg/79xQ1kC2elKluIhAcFswxGvOfIKtsq2fpN0BeNkCsR6fhV+YD397W2+HwGw==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Type = _t, AEG = _t, AlarmCode = _t, Index = _t, StartTime = _t, EndTime = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"StartTime", type datetime}, {"EndTime", type datetime}}),
    // One list of whole-minute numbers per alarm: {start minute .. end minute - 1}, counted from day zero
    #"Outage Interval" = Table.AddColumn(#"Changed Type", "Interval", each { Int64.From(Number.From([StartTime])*1440)..Int64.From(Number.From([EndTime])*1440-1)}),
    // Per machine: combine all alarm intervals and keep each minute only once, so overlaps are not double counted
    #"Grouped Rows by Machine" = Table.Group(#"Outage Interval", {"AEG"}, {{"Rows", each List.Distinct(List.Combine(_[Interval])), type table [ Interval=list]}}),
    // Day-number mask spanning the whole extract (min StartTime to max EndTime)
    #"Added Dates" = Table.AddColumn(#"Grouped Rows by Machine", "Date", each {Number.RoundDown(Number.From(List.Min(#"Changed Type"[StartTime])),0)..Number.RoundDown(Number.From(List.Max(#"Changed Type"[EndTime])),0)}),
    #"Expanded Dates" = Table.ExpandListColumn(#"Added Dates", "Date"),
    // The minute numbers that belong to each calendar day
    #"Date Interval" = Table.AddColumn(#"Expanded Dates", "DateInterval", each {[Date]*1440..([Date]+1)*1440-1}),
    // Downtime per machine per day = outage minutes that fall inside that day's minute range
    #"Added Custom" = Table.AddColumn(#"Date Interval", "Minutes", each List.Count(List.Intersect({[Rows],[DateInterval]})),Int64.Type),
    #"Changed Type1" = Table.TransformColumnTypes(#"Added Custom",{{"Date", type date}}),
    #"Removed Other Columns" = Table.SelectColumns(#"Changed Type1",{"AEG", "Date", "Minutes"})
in
    #"Removed Other Columns"

 

That does return the same results as mine. What do you think the advantages/disadvantages are of the two approaches?

You would have to throw both at a substantially larger amount of data and see how they perform. The date range mask, for example, could be customized to cover only the min/max dates for each machine rather than the entire range.
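For illustration, here is a minimal, self-contained sketch of that idea on toy data (hypothetical values); in the real query the same per-machine min/max logic would replace the global List.Min/List.Max used in the "Added Dates" step:

let
    // Toy outage table (hypothetical rows), one row per alarm
    Outages = #table(
        type table [AEG = text, StartTime = datetime, EndTime = datetime],
        {
            {"Machine 2", #datetime(2023,11,2,8,4,37), #datetime(2023,11,2,8,10,45)},
            {"Machine 4", #datetime(2023,11,3,14,23,20), #datetime(2023,11,3,16,21,58)}
        }),
    // Per machine, build a date mask that only spans that machine's own first/last alarm
    Grouped = Table.Group(Outages, {"AEG"}, {
        {"Date", each
            List.Dates(
                Date.From(List.Min([StartTime])),
                Duration.Days(Date.From(List.Max([EndTime])) - Date.From(List.Min([StartTime]))) + 1,
                #duration(1,0,0,0)),
         type list}}),
    // One row per machine per date in its own range
    Expanded = Table.ExpandListColumn(Grouped, "Date")
in
    Expanded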

 

Keep in mind that Power Query runs on disk, so it will nearly always be substantially slower than DAX (which runs in memory).  A List.Buffer here and there might mitigate that.
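As a small illustration of where List.Buffer could help, here is a sketch on hypothetical minute lists; in the queries above the natural candidate is the combined interval list that is then scanned by List.Distinct and List.Intersect:

let
    // Two overlapping alarms expressed as lists of minute numbers (hypothetical values)
    intervals = {{100..160}, {130..200}},
    // Buffer the combined list once so the later passes read it from memory
    buffered = List.Buffer(List.Combine(intervals)),
    // Overlapping minutes are only counted once
    distinctMinutes = List.Count(List.Distinct(buffered))
in
    distinctMinutes  // 101 minutes (100..200)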

@lbendlin Oh, I wasn't asking about a comparison with DAX; rather I was asking about a comparison with your M method.

You can use Query diagnostics to compare them, but the real test is at volume. Can they handle millions of rows and hundreds of dates?

lbendlin
Super User

Here's a graphical approach using Deneb:

 

lbendlin_0-1700794508685.png

 

Syndicate_Admin
Administrator

Your data is too detailed. You have millisecond-level timestamps, which would result in 86,400,000 rows per day for a mapping table. Can you reduce the granularity to second or minute level?
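For reference, the M solution above already rounds the timestamps to whole minutes, which reduces the 86,400,000 possible values per day to 1,440. A minimal standalone sketch of that rounding, on a hypothetical timestamp:

let
    ts = #datetime(2023, 11, 2, 8, 4, 37.248),
    // 1440 = minutes per day; round the day fraction to the nearest whole minute
    roundedToMinute = DateTime.From(Number.Round(Number.From(ts) * 1440, 0) / 1440)
in
    roundedToMinute  // 2023-11-02T08:05:00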

Hi admin! Thanks for helping.

 

Yeah, minute level is great, since I'm going to have many other entries.

Here is an approach proposed by @jpessoa8 

 

Hours =
// d: one minute-number row for every minute of every date visible in the current filter context of 'Dates'
VAR d =
    SELECTCOLUMNS(
        ADDCOLUMNS(
            CROSSJOIN('Dates', GENERATESERIES(0, 1439)),
            "ms", 'Dates'[Date] * 1440 + [Value]
        ),
        "Value", [ms]
    )
// p: one minute-number row for every minute covered by at least one outage in the current filter context
VAR p =
    SELECTCOLUMNS(
        GENERATE(
            VALUES('Outages'[StartTime]),
            VAR EndValue = CALCULATE(MAX('Outages'[EndTime]))
            RETURN GENERATESERIES(INT('Outages'[StartTime] * 1440), EndValue * 1440 - 1)
        ),
        "Value", [Value]
    )
RETURN
    // Minutes that fall both in the visible dates and in an outage, converted to hours
    COUNTROWS(INTERSECT(d, p)) / 60

lbendlin_0-1700946717644.png

see attached
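Note that the measure assumes a 'Dates' table with one row per date covering the outage period. Here is a minimal sketch of one way to build it in Power Query, assuming the alarm table is loaded as 'Outages' with typed StartTime/EndTime columns (a DAX calculated table using CALENDAR would work just as well):

let
    Source = Outages,
    // Span every whole day from the first alarm to the last
    firstDate = Date.From(List.Min(Source[StartTime])),
    lastDate = Date.From(List.Max(Source[EndTime])),
    Dates = Table.FromColumns(
        {List.Dates(firstDate, Duration.Days(lastDate - firstDate) + 1, #duration(1,0,0,0))},
        type table [Date = date])
in
    Dates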

 
