caseski
Helper I

Calculate energy consumption from power readings taken every two seconds

Dear all, in my database the power (kW) drawn by a system is recorded every two seconds.
Over the course of 24 hours there are non-zero readings (when the system is on) and readings equal to zero (when the system is off). The number of zero and non-zero readings in a 24-hour period is unknown. I would like to calculate the system's total daily consumption in kWh. What function should I use? One approach would be to calculate the average daily power (excluding zeros) and multiply it by the operating time, but I do not know how to do that.

Here is an example of the table data:

TimeCol                DL_09_POWER_MILL
30/01/2024 11:36:54    0
30/01/2024 11:36:56    0
30/01/2024 11:36:58    0
30/01/2024 11:37:00    0
30/01/2024 11:37:02    65,200233
30/01/2024 11:37:04    61,405251
30/01/2024 11:37:06    64,022934
30/01/2024 11:37:08    53,313046
30/01/2024 11:37:10    60,697857
30/01/2024 11:37:12    60,000023
30/01/2024 11:37:14    65,124931
30/01/2024 11:37:16    73,41111
30/01/2024 11:37:18    70,811569
30/01/2024 11:37:20    68,201675
30/01/2024 11:37:22    72,599777
30/01/2024 11:37:24    71,482063
30/01/2024 11:37:26    75,42907
30/01/2024 11:37:28    79,3694
30/01/2024 11:37:30    73,902763
30/01/2024 11:37:32    79,199295
30/01/2024 11:37:34    95,813103
30/01/2024 11:37:36    97,644234
30/01/2024 11:37:38    98,688683
30/01/2024 11:37:40    103,59272
30/01/2024 11:37:42    104,099937
30/01/2024 11:37:44    117,499672
30/01/2024 11:37:46    122,052467
30/01/2024 11:37:48    103,579269
30/01/2024 11:37:50    123,074402
30/01/2024 11:37:52    110,203918
30/01/2024 11:37:54    126,499588
30/01/2024 11:37:56    130,512711
30/01/2024 11:37:58    127,391762
30/01/2024 11:38:00    129,081329
30/01/2024 11:38:02    127,600601
30/01/2024 11:38:04    124,400398
30/01/2024 11:38:06    145,819138
30/01/2024 11:38:08    150,470245
30/01/2024 11:38:10    134,449738
30/01/2024 11:38:12    156,357635
30/01/2024 11:38:14    141,101379
30/01/2024 11:38:16    161,199478
30/01/2024 11:38:18    154,087448
30/01/2024 11:38:20    147,206772
30/01/2024 11:38:22    144,108383
30/01/2024 11:38:24    166,884338
30/01/2024 11:38:26    167,599915
30/01/2024 11:38:28    166,545547
30/01/2024 11:38:30    169,162933
30/01/2024 11:38:32    160,730499
30/01/2024 11:38:34    176,482681
30/01/2024 11:38:36    167,80043
30/01/2024 11:38:38    171,299866
30/01/2024 11:38:40    182,504654
30/01/2024 11:38:42    171,624756
30/01/2024 11:38:44    172,896622
30/01/2024 11:38:46    178,596222
30/01/2024 11:38:48    177,300217
30/01/2024 11:38:50    170,300095
30/01/2024 11:38:52    167,383911
30/01/2024 11:38:54    171,878372
30/01/2024 11:38:56    180,092758
30/01/2024 11:38:58    169,902496
30/01/2024 11:39:00    173,499878
30/01/2024 11:39:02    168,045303
30/01/2024 11:39:04    170,145218
30/01/2024 11:39:06    168,106323
30/01/2024 11:39:08    176,39859
30/01/2024 11:39:10    172,00032
30/01/2024 11:39:12    170,172318
30/01/2024 11:39:14    159,325821
30/01/2024 11:39:16    159,496155
30/01/2024 11:39:18    176,484024
1 ACCEPTED SOLUTION
collinsg
Super User

Good day caseski,

If the power reading is p at the end of a 2-second period, you may assume that p is a reasonable approximation to the power throughout the prior 2 seconds. If p is in kW, the energy expended in those 2 seconds is p*(2/3600) kWh (as 2 seconds is 2/3600 of an hour).
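For example (an illustrative figure, not taken from the data above): a reading of 90 kW at the end of one 2-second interval contributes 90 * (2/3600) = 0.05 kWh.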

 

The following steps will give you the energy per day... essentially you integrate the area under the chart of power vs. time.

  1. Add an index column, starting at 0.
  2. Calculate the energy consumed in each period:
    1. Calculate the length of the period in hours (this is where the index column comes in: take the timestamp in the current row and subtract the timestamp in the row whose index is one less). Do not assume all periods are 2 seconds; in the real world, readings get missed. I discard the first row as it has no prior row, which hardly matters over the course of a day.
    2. Multiply by the power.
  3. Transform the timestamp column to date only.
  4. Group by date, summing energy.

If you have many days of data this could be an expensive calculation. In that circumstance you could try a rougher approximation: group by minute, take an average (mean or median) power for each minute, and then integrate from there (a sketch follows the query below). This would reduce the number of rows by a factor of 30.

Hope this helps.

 

The following code uses your data as a starting point.

let
    Source = Table.FromRows(
        Json.Document(
            Binary.Decompress(
                Binary.FromText(
                    "fZZJbm0hDES38pUxUtzhJluJsv9t/KLJDOcNLyVTmOL4fX9/KH0SfwqJ/WP+Uv+a9jE+6ONnvNb8j7Xs1uKL6I81wZrPIUSi2miWJ+dhNGVyo1ne3AaJlFqjWR6nDmUl87eGl1en4RU5o9HI0RAt043GzrlYrLTxzMtz6DDGr5Esy0EjmafXWyPbcqKF7DEbzbIcMmZVRHMsWZYDbU4hb44l2/IcJkVdmW25hno196B0zl0k0W2kcqpwlVRzKF2Ga6I5ytTVWYYrhptJlwtdjiuHZ3o2dWx5xi7ooIQ0GtkahLCqtGmPLdPMMazK20rLNYsMJN68q5S/lqKkC8fcvkUHhRk1281tnAkR0kLk3qJtXHwZn9mJtnGlMVmiC/VmBdYHNgt/esoDDZYahOuV5+ny0GNVciKn53Z58IF3CH7geE/jefjBtsJUrJ1oG580LPDxmco8BGHFdlbRVdoIwZMeOvEEukrbuPFgYo2mBRsiDDripVh02x3jSGYiBo1oc4QtkAKPdzLzgITN4Cn1/VjykITdR6Zp1wI5xmMxqbhpwYbJqjRtTns+hDw8YQcsHPBvPG2gMKgdgH81zdxEQSYXAz2bPOmv8SSybrftGzCVqvTnrMnLlASVMY7mE055oYJKLhazq3R8y8hyl+bqDlQi0XBoOtExjreJ2cZNww9UMJYgojed80IFbUJM6o2CvFDB6TJSu8wdqCQGLtg7mzgdqCAEmCpWzz7VhcoaubiW92OpCxUMVLKp77lSFypoAZghb2TWhQoqMQbq+59CXaggc2DTfOayLlNwv+i3PrtUFynLUYh2jg5SJqazzJTnpdRFCkToI8/n9dZFyn4qmCsI789/",
                    BinaryEncoding.Base64
                ),
                Compression.Deflate
            )
        ),
        let
            _t = ((type nullable text) meta [Serialized.Text = true])
        in
            type table [TimeColDL_09_POWER_MILL = _t, #"(blank)" = _t]
    ),
    #"Renamed Columns" = Table.RenameColumns(
        Source, {{"(blank)", "Power"}, {"TimeColDL_09_POWER_MILL", "Timestamp"}}
    ),
    #"Changed Type" = Table.TransformColumnTypes(
        #"Renamed Columns", {{"Timestamp", type datetime}, {"Power", Int64.Type}}
    ),

    // These are the important steps...

    // Period length = current timestamp minus the previous row's timestamp
    // (index - 1, clamped to 0 for the first row), in hours; times power = kWh.
    #"Added Index" = Table.AddIndexColumn(#"Changed Type", "Index", 0, 1, Int64.Type),
    #"Added Energy" = Table.AddColumn(
        #"Added Index",
        "Energy",
        each Duration.TotalHours([Timestamp] - #"Added Index"{List.Max({[Index] - 1, 0})}[Timestamp]) * [Power],
        type number
    ),
    #"Extracted Date" = Table.TransformColumns(#"Added Energy", {{"Timestamp", DateTime.Date, type date}}),
    #"Grouped Rows" = Table.Group(
        #"Extracted Date", {"Timestamp"}, {{"Energy for Day", each List.Sum([Energy]), type number}}
    )
in
    #"Grouped Rows"

 


2 REPLIES 2

dufoq3
Super User

Hi @caseski,

 

I treated the power recorded every 2 seconds as an hourly rate (kW), so my daily calculation logic was:

  1. Calculated the average power (zero readings filtered out).
  2. Multiplied it by 24 to get a daily figure.
  3. Each day is shown separately.

[Screenshot dufoq3_1-1706640199752.png: the resulting table of Avg Daily Power and Daily Detail by Date]

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("fZZJbm0hDES38pUxUtzhJluJsv9t/KLJDOcNLyVTmOL4fX9/KH0SfwqJ/WP+Uv+a9jE+6ONnvNb8j7Xs1uKL6I81wZrPIUSi2miWJ+dhNGVyo1ne3AaJlFqjWR6nDmUl87eGl1en4RU5o9HI0RAt043GzrlYrLTxzMtz6DDGr5Esy0EjmafXWyPbcqKF7DEbzbIcMmZVRHMsWZYDbU4hb44l2/IcJkVdmW25hno196B0zl0k0W2kcqpwlVRzKF2Ga6I5ytTVWYYrhptJlwtdjiuHZ3o2dWx5xi7ooIQ0GtkahLCqtGmPLdPMMazK20rLNYsMJN68q5S/lqKkC8fcvkUHhRk1281tnAkR0kLk3qJtXHwZn9mJtnGlMVmiC/VmBdYHNgt/esoDDZYahOuV5+ny0GNVciKn53Z58IF3CH7geE/jefjBtsJUrJ1oG580LPDxmco8BGHFdlbRVdoIwZMeOvEEukrbuPFgYo2mBRsiDDripVh02x3jSGYiBo1oc4QtkAKPdzLzgITN4Cn1/VjykITdR6Zp1wI5xmMxqbhpwYbJqjRtTns+hDw8YQcsHPBvPG2gMKgdgH81zdxEQSYXAz2bPOmv8SSybrftGzCVqvTnrMnLlASVMY7mE055oYJKLhazq3R8y8hyl+bqDlQi0XBoOtExjreJ2cZNww9UMJYgojed80IFbUJM6o2CvFDB6TJSu8wdqCQGLtg7mzgdqCAEmCpWzz7VhcoaubiW92OpCxUMVLKp77lSFypoAZghb2TWhQoqMQbq+59CXaggc2DTfOayLlNwv+i3PrtUFynLUYh2jg5SJqazzJTnpdRFCkToI8/n9dZFyn4qmCsI789/", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [TimeColDL_09 = _t, Power = _t]),
    #"Filtered Rows" = Table.SelectRows(Source, each ([Power] <> "0" and [Power] <> "" and [Power] <> null)),
    #"Changed Type" = Table.TransformColumnTypes(#"Filtered Rows",{{"TimeColDL_09", type datetime}, {"Power", type number}}, "sk-SK"),
    #"Inserted Date" = Table.AddColumn(#"Changed Type", "Date", each DateTime.Date([TimeColDL_09]), type date),
    #"Grouped Rows" = Table.Group(#"Inserted Date", {"Date"}, {{"Avg Daily Power", each List.Average([Power]) * 24, type number}, {"Daily Detail", each _, type table }})
in
    #"Grouped Rows"

 

 

Let me know if this is what you need.


