Dear all, in my database the power (kW) of an electrical system is recorded every two seconds.
Over the course of 24 hours there are readings different from zero (when the system is on) and readings equal to zero (when the system is off). The number of zero and non-zero readings over a 24-hour period is unknown. I would like to calculate the total daily consumption of the system in kWh. What function should I use? One approach would be to calculate the average daily power (excluding zeros) and multiply it by the operating time, but I do not know how to do that.
Here is an example of table data
TimeColDL_09_POWER_MILL
30/01/2024 11:36:54 | 0 |
30/01/2024 11:36:56 | 0 |
30/01/2024 11:36:58 | 0 |
30/01/2024 11:37:00 | 0 |
30/01/2024 11:37:02 | 65,200233 |
30/01/2024 11:37:04 | 61,405251 |
30/01/2024 11:37:06 | 64,022934 |
30/01/2024 11:37:08 | 53,313046 |
30/01/2024 11:37:10 | 60,697857 |
30/01/2024 11:37:12 | 60,000023 |
30/01/2024 11:37:14 | 65,124931 |
30/01/2024 11:37:16 | 73,41111 |
30/01/2024 11:37:18 | 70,811569 |
30/01/2024 11:37:20 | 68,201675 |
30/01/2024 11:37:22 | 72,599777 |
30/01/2024 11:37:24 | 71,482063 |
30/01/2024 11:37:26 | 75,42907 |
30/01/2024 11:37:28 | 79,3694 |
30/01/2024 11:37:30 | 73,902763 |
30/01/2024 11:37:32 | 79,199295 |
30/01/2024 11:37:34 | 95,813103 |
30/01/2024 11:37:36 | 97,644234 |
30/01/2024 11:37:38 | 98,688683 |
30/01/2024 11:37:40 | 103,59272 |
30/01/2024 11:37:42 | 104,099937 |
30/01/2024 11:37:44 | 117,499672 |
30/01/2024 11:37:46 | 122,052467 |
30/01/2024 11:37:48 | 103,579269 |
30/01/2024 11:37:50 | 123,074402 |
30/01/2024 11:37:52 | 110,203918 |
30/01/2024 11:37:54 | 126,499588 |
30/01/2024 11:37:56 | 130,512711 |
30/01/2024 11:37:58 | 127,391762 |
30/01/2024 11:38:00 | 129,081329 |
30/01/2024 11:38:02 | 127,600601 |
30/01/2024 11:38:04 | 124,400398 |
30/01/2024 11:38:06 | 145,819138 |
30/01/2024 11:38:08 | 150,470245 |
30/01/2024 11:38:10 | 134,449738 |
30/01/2024 11:38:12 | 156,357635 |
30/01/2024 11:38:14 | 141,101379 |
30/01/2024 11:38:16 | 161,199478 |
30/01/2024 11:38:18 | 154,087448 |
30/01/2024 11:38:20 | 147,206772 |
30/01/2024 11:38:22 | 144,108383 |
30/01/2024 11:38:24 | 166,884338 |
30/01/2024 11:38:26 | 167,599915 |
30/01/2024 11:38:28 | 166,545547 |
30/01/2024 11:38:30 | 169,162933 |
30/01/2024 11:38:32 | 160,730499 |
30/01/2024 11:38:34 | 176,482681 |
30/01/2024 11:38:36 | 167,80043 |
30/01/2024 11:38:38 | 171,299866 |
30/01/2024 11:38:40 | 182,504654 |
30/01/2024 11:38:42 | 171,624756 |
30/01/2024 11:38:44 | 172,896622 |
30/01/2024 11:38:46 | 178,596222 |
30/01/2024 11:38:48 | 177,300217 |
30/01/2024 11:38:50 | 170,300095 |
30/01/2024 11:38:52 | 167,383911 |
30/01/2024 11:38:54 | 171,878372 |
30/01/2024 11:38:56 | 180,092758 |
30/01/2024 11:38:58 | 169,902496 |
30/01/2024 11:39:00 | 173,499878 |
30/01/2024 11:39:02 | 168,045303 |
30/01/2024 11:39:04 | 170,145218 |
30/01/2024 11:39:06 | 168,106323 |
30/01/2024 11:39:08 | 176,39859 |
30/01/2024 11:39:10 | 172,00032 |
30/01/2024 11:39:12 | 170,172318 |
30/01/2024 11:39:14 | 159,325821 |
30/01/2024 11:39:16 | 159,496155 |
30/01/2024 11:39:18 | 176,484024 |
Good day caseki,
If the power reading is p at the end of a 2 second period you may assume that p is a reasonable approximation to the power throughout the prior 2 seconds. If p is in kW, the energy expended in those 2 seconds is p*(2/3600) kWh (as 2 seconds is 2/3600th of an hour).
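As a quick sanity check of that arithmetic (a minimal Python sketch, not part of the original solution):

```python
# Energy contributed by one sampling interval, assuming the power
# reading (kW) held steady over the preceding `seconds`.
def interval_energy_kwh(power_kw: float, seconds: float = 2.0) -> float:
    return power_kw * seconds / 3600.0

# e.g. a 180 kW reading over 2 seconds contributes 0.1 kWh
```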
The following steps will give you the energy per day; essentially you integrate the area under the chart of p vs. time.
If you have many days of data this could be an expensive calculation. In that case you could try a rougher approximation: group by minute, take an average (mean or median) power for the minute, and then integrate from there. This would reduce the number of rows by a factor of 30.
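The minute-grouping approximation can be sketched as follows (a hypothetical Python illustration with made-up readings, not the original M code):

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical 2-second readings: (timestamp, power_kw)
readings = [
    (datetime(2024, 1, 30, 11, 37, 2), 60.0),
    (datetime(2024, 1, 30, 11, 37, 4), 70.0),
    (datetime(2024, 1, 30, 11, 38, 2), 120.0),
]

# Group readings by minute and take the mean power for each minute.
per_minute = defaultdict(list)
for ts, p in readings:
    per_minute[ts.replace(second=0, microsecond=0)].append(p)

# Each minute then contributes mean_kw * (60/3600) kWh.
energy_kwh = sum(sum(v) / len(v) * 60 / 3600 for v in per_minute.values())
```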
Hope this helps.
The following code uses your data as a starting point.
let
Source = Table.FromRows(
Json.Document(
Binary.Decompress(
Binary.FromText(
"fZZJbm0hDES38pUxUtzhJluJsv9t/KLJDOcNLyVTmOL4fX9/KH0SfwqJ/WP+Uv+a9jE+6ONnvNb8j7Xs1uKL6I81wZrPIUSi2miWJ+dhNGVyo1ne3AaJlFqjWR6nDmUl87eGl1en4RU5o9HI0RAt043GzrlYrLTxzMtz6DDGr5Esy0EjmafXWyPbcqKF7DEbzbIcMmZVRHMsWZYDbU4hb44l2/IcJkVdmW25hno196B0zl0k0W2kcqpwlVRzKF2Ga6I5ytTVWYYrhptJlwtdjiuHZ3o2dWx5xi7ooIQ0GtkahLCqtGmPLdPMMazK20rLNYsMJN68q5S/lqKkC8fcvkUHhRk1281tnAkR0kLk3qJtXHwZn9mJtnGlMVmiC/VmBdYHNgt/esoDDZYahOuV5+ny0GNVciKn53Z58IF3CH7geE/jefjBtsJUrJ1oG580LPDxmco8BGHFdlbRVdoIwZMeOvEEukrbuPFgYo2mBRsiDDripVh02x3jSGYiBo1oc4QtkAKPdzLzgITN4Cn1/VjykITdR6Zp1wI5xmMxqbhpwYbJqjRtTns+hDw8YQcsHPBvPG2gMKgdgH81zdxEQSYXAz2bPOmv8SSybrftGzCVqvTnrMnLlASVMY7mE055oYJKLhazq3R8y8hyl+bqDlQi0XBoOtExjreJ2cZNww9UMJYgojed80IFbUJM6o2CvFDB6TJSu8wdqCQGLtg7mzgdqCAEmCpWzz7VhcoaubiW92OpCxUMVLKp77lSFypoAZghb2TWhQoqMQbq+59CXaggc2DTfOayLlNwv+i3PrtUFynLUYh2jg5SJqazzJTnpdRFCkToI8/n9dZFyn4qmCsI789/",
BinaryEncoding.Base64
),
Compression.Deflate
)
),
let
_t = ((type nullable text) meta [Serialized.Text = true])
in
type table [TimeColDL_09_POWER_MILL = _t, #"(blank)" = _t]
),
#"Renamed Columns" = Table.RenameColumns(
Source, {{"(blank)", "Power"}, {"TimeColDL_09_POWER_MILL", "Timestamp"}}
),
#"Changed Type" = Table.TransformColumnTypes(
// Power values use a comma decimal separator (e.g. 65,200233), so parse as
// number with an explicit culture rather than Int64.Type; "it-IT" is assumed
// from the dd/MM/yyyy, comma-decimal sample data
#"Renamed Columns", {{"Timestamp", type datetime}, {"Power", type number}}, "it-IT"
),
//These are the important steps...
#"Added Index" = Table.AddIndexColumn(#"Changed Type", "Index", 0, 1, Int64.Type),
#"Added Energy" = Table.AddColumn(
#"Added Index",
"Energy",
each Duration.TotalHours([Timestamp] - #"Added Index"{List.Max({[Index] - 1, 0})}[Timestamp]) * [Power], type number),
#"Extracted Date" = Table.TransformColumns(#"Added Energy", {{"Timestamp", DateTime.Date, type date}}),
#"Grouped Rows" = Table.Group(
#"Extracted Date", {"Timestamp"}, {{"Energy for Day", each List.Sum([Energy]), type number}} )
in
#"Grouped Rows"
Hi @caseski,
I treat the power recorded every 2 seconds as an instantaneous reading in kW, so my daily calculation is the average non-zero power multiplied by 24 hours:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("fZZJbm0hDES38pUxUtzhJluJsv9t/KLJDOcNLyVTmOL4fX9/KH0SfwqJ/WP+Uv+a9jE+6ONnvNb8j7Xs1uKL6I81wZrPIUSi2miWJ+dhNGVyo1ne3AaJlFqjWR6nDmUl87eGl1en4RU5o9HI0RAt043GzrlYrLTxzMtz6DDGr5Esy0EjmafXWyPbcqKF7DEbzbIcMmZVRHMsWZYDbU4hb44l2/IcJkVdmW25hno196B0zl0k0W2kcqpwlVRzKF2Ga6I5ytTVWYYrhptJlwtdjiuHZ3o2dWx5xi7ooIQ0GtkahLCqtGmPLdPMMazK20rLNYsMJN68q5S/lqKkC8fcvkUHhRk1281tnAkR0kLk3qJtXHwZn9mJtnGlMVmiC/VmBdYHNgt/esoDDZYahOuV5+ny0GNVciKn53Z58IF3CH7geE/jefjBtsJUrJ1oG580LPDxmco8BGHFdlbRVdoIwZMeOvEEukrbuPFgYo2mBRsiDDripVh02x3jSGYiBo1oc4QtkAKPdzLzgITN4Cn1/VjykITdR6Zp1wI5xmMxqbhpwYbJqjRtTns+hDw8YQcsHPBvPG2gMKgdgH81zdxEQSYXAz2bPOmv8SSybrftGzCVqvTnrMnLlASVMY7mE055oYJKLhazq3R8y8hyl+bqDlQi0XBoOtExjreJ2cZNww9UMJYgojed80IFbUJM6o2CvFDB6TJSu8wdqCQGLtg7mzgdqCAEmCpWzz7VhcoaubiW92OpCxUMVLKp77lSFypoAZghb2TWhQoqMQbq+59CXaggc2DTfOayLlNwv+i3PrtUFynLUYh2jg5SJqazzJTnpdRFCkToI8/n9dZFyn4qmCsI789/", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [TimeColDL_09 = _t, Power = _t]),
#"Filtered Rows" = Table.SelectRows(Source, each ([Power] <> "0" and [Power] <> "" and [Power] <> null)),
#"Changed Type" = Table.TransformColumnTypes(#"Filtered Rows",{{"TimeColDL_09", type datetime}, {"Power", type number}}, "sk-SK"),
#"Inserted Date" = Table.AddColumn(#"Changed Type", "Date", each DateTime.Date([TimeColDL_09]), type date),
#"Grouped Rows" = Table.Group(#"Inserted Date", {"Date"}, {{"Avg Daily Power", each List.Average([Power]) * 24, type number}, {"Daily Detail", each _, type table }})
in
#"Grouped Rows"
Let me know if this is what you need.