Dear all,
I've been banging my head against this problem for a while now; I've read around a lot and tried many solutions, but nothing seems to fit. My hunch is that it has to do with recursive calculations and the fact that Power BI cannot perform them.
I'm trying to create a dashboard in Power BI from a very simple Excel file, but I cannot seem to manage it.
The main source data are a buffer level and a valve state: the buffer gets (partially) emptied while the valve is open (valve value = 1), and I'd like to calculate the level decrease every time the valve is opened (and then closed again, valve value = 0).
In Excel, I simply keep summing up the level decrease until the next value of the valve is 0 (valve closed). To do that, I reference the previous row of the same column (in principle it is a cumulative measure based on the previous row). The accumulation needs to stop when the valve is closed and start over when the valve opens again.
I've recreated the calculations in Power BI, but I get stuck on this last calculation because the accumulation does not stop and restart as I'd like: the next time the valve opens, whatever was accumulated during the previous unloading gets added on top!
This is the formula I'm using now, with tables showing the data further below (unfortunately, due to the restrictions on my work computer, I cannot upload any files):
Cumulative level difference =
VAR calc =
    CALCULATE (
        SUM ( 'Table'[Level difference during unloading] ),
        TOPN (
            100,
            FILTER ( 'Table', 'Table'[Index] <= EARLIER ( 'Table'[Index] ) ),
            'Table'[Index], DESC
        )
    )
VAR _if = calc * 'Table'[Valve open/close]
RETURN
    _if
Any help would be greatly appreciated!
Thanks
Sara
Solved! Go to Solution.
Hi, thank you! I thought I had replied, but I don't see my reply anymore.
Because of the restrictions on my work computer I cannot upload files to external services such as WeTransfer, so I have copied in the tables with the expected result and with what I get instead.
Unfortunately, Power Query doesn't seem to be an option, because one of my columns is a rolling average and that takes too much time/power in Power Query (unless I'm doing something wrong). Do you have a solution for that?
And if so, which arguments should List.Accumulate take? Thank you!
What I expect:
Excel formulas used (columns referenced by letter): corrected level difference (F) = D - E; Level difference during unloading (G) = IF(C=0; 0; F); Cumulative level difference (H) = IF(G=0; 0; H[n-1] + G)
Timestamp | Buffer level | Valve open/close | Level difference | rolling average level difference | corrected level difference | Level difference during unloading | Cumulative level difference | Filter trucks |
---|---|---|---|---|---|---|---|---|
05:03:00 | 178,5 | 0 | 0,528 | 0 | ||||
05:06:00 | 180,2 | 0 | 1,737 | 0,556 | 1,182 | 0 | 0 | 0 |
05:09:00 | 180,3 | 0 | 0,085 | 0,545 | -0,460 | 0 | 0 | 0 |
05:12:00 | 180,8 | 0 | 0,539 | 0,545 | -0,006 | 0 | 0 | 0 |
05:15:00 | 181,7 | 0 | 0,878 | 0,552 | 0,326 | 0 | 0 | 0 |
05:18:00 | 181,1 | 0 | -0,649 | 0,527 | -1,175 | 0 | 0 | 0 |
05:21:00 | 179,4 | 1 | -1,661 | 0,527 | -2,188 | -2,187712627 | -2,187712627 | 0 |
05:24:00 | 171,5 | 1 | -7,948 | 0,538 | -8,486 | -8,486121171 | -10,6738338 | 0 |
05:27:00 | 166,3 | 1 | -5,180 | 0,499 | -5,679 | -5,679383612 | -16,35321741 | 0 |
05:30:00 | 160,1 | 1 | -6,164 | 0,485 | -6,649 | -6,648694671 | -23,00191208 | 0 |
05:33:00 | 158,0 | 1 | -2,076 | 0,575 | -2,651 | -2,651090943 | -25,65300302 | 0 |
05:36:00 | 158,2 | 1 | 0,137 | 0,489 | -0,353 | -0,352762115 | -26,00576514 | 1 |
05:39:00 | 158,4 | 0 | 0,173 | 0,449 | -0,276 | 0 | 0 | 0 |
05:42:00 | 159,7 | 0 | 1,354 | 0,470 | 0,884 | 0 | 0 | 0 |
05:45:00 | 160,7 | 0 | 1,042 | 0,523 | 0,519 | 0 | 0 | 0 |
05:48:00 | 160,8 | 0 | 0,032 | 0,494 | -0,462 | 0 | 0 | 0 |
05:51:00 | 160,2 | 1 | -0,618 | 0,439 | -1,057 | -1,057102974 | -1,057102974 | 0 |
05:54:00 | 156,9 | 1 | -3,280 | 0,409 | -3,689 | -3,688966736 | -4,74606971 | 0 |
05:57:00 | 153,5 | 1 | -3,340 | 0,455 | -3,795 | -3,795230347 | -8,541300057 | 0 |
06:00:00 | 151,9 | 1 | -1,599 | 0,552 | -2,151 | -2,150819885 | -10,69211994 | 0 |
06:03:00 | 150,3 | 1 | -1,677 | 0,450 | -2,127 | -2,126959898 | -12,81907984 | 0 |
06:06:00 | 150,1 | 1 | -0,200 | 0,429 | -0,629 | -0,62893065 | -13,44801049 | 0 |
06:09:00 | 147,7 | 1 | -2,317 | 0,395 | -2,713 | -2,712622644 | -16,16063313 | 1 |
06:12:00 | 150,7 | 0 | 2,961 | 0,468 | 2,493 | 0 | 0 | 0 |
06:15:00 | 150,5 | 0 | -0,247 | 0,439 | -0,686 | 0 | 0 | 0 |
06:18:00 | 148,8 | 0 | -1,616 | 0,402 | -2,017 | 0 | 0 | 0 |
What I get:
Timestamp | Buffer level | Valve open/close | Level difference | rolling average level difference | corrected level difference | Level difference during unloading | Cumulative level difference | Filter trucks |
---|---|---|---|---|---|---|---|---|
05:03:00 | 178,5 | 0 | 0,528 | 0 | ||||
05:06:00 | 180,2 | 0 | 1,737 | 0,556 | 1,182 | 0 | 0 | 0 |
05:09:00 | 180,3 | 0 | 0,085 | 0,545 | -0,460 | 0 | 0 | 0 |
05:12:00 | 180,8 | 0 | 0,539 | 0,545 | -0,006 | 0 | 0 | 0 |
05:15:00 | 181,7 | 0 | 0,878 | 0,552 | 0,326 | 0 | 0 | 0 |
05:18:00 | 181,1 | 0 | -0,649 | 0,527 | -1,175 | 0 | 0 | 0 |
05:21:00 | 179,4 | 1 | -1,661 | 0,527 | -2,188 | -2,187712627 | -2,187712627 | 0 |
05:24:00 | 171,5 | 1 | -7,948 | 0,538 | -8,486 | -8,486121171 | -10,6738338 | 0 |
05:27:00 | 166,3 | 1 | -5,180 | 0,499 | -5,679 | -5,679383612 | -16,35321741 | 0 |
05:30:00 | 160,1 | 1 | -6,164 | 0,485 | -6,649 | -6,648694671 | -23,00191208 | 0 |
05:33:00 | 158,0 | 1 | -2,076 | 0,575 | -2,651 | -2,651090943 | -25,65300302 | 0 |
05:36:00 | 158,2 | 1 | 0,137 | 0,489 | -0,353 | -0,352762115 | -26,00576514 | 1 |
05:39:00 | 158,4 | 0 | 0,173 | 0,449 | -0,276 | 0 | 0 | 0 |
05:42:00 | 159,7 | 0 | 1,354 | 0,470 | 0,884 | 0 | 0 | 0 |
05:45:00 | 160,7 | 0 | 1,042 | 0,523 | 0,519 | 0 | 0 | 0 |
05:48:00 | 160,8 | 0 | 0,032 | 0,494 | -0,462 | 0 | 0 | 0 |
05:51:00 | 160,2 | 1 | -0,618 | 0,439 | -1,057 | -1,057102974 | -27,06286811 | 0 |
05:54:00 | 156,9 | 1 | -3,280 | 0,409 | -3,689 | -3,688966736 | -30,75183485 | 0 |
05:57:00 | 153,5 | 1 | -3,340 | 0,455 | -3,795 | -3,795230347 | -34,5470652 | 0 |
06:00:00 | 151,9 | 1 | -1,599 | 0,552 | -2,151 | -2,150819885 | -36,69788508 | 0 |
06:03:00 | 150,3 | 1 | -1,677 | 0,450 | -2,127 | -2,126959898 | -38,82484498 | 0 |
06:06:00 | 150,1 | 1 | -0,200 | 0,429 | -0,629 | -0,62893065 | -39,45377563 | 0 |
06:09:00 | 147,7 | 1 | -2,317 | 0,395 | -2,713 | -2,712622644 | -42,16639827 | 1 |
06:12:00 | 150,7 | 0 | 2,961 | 0,468 | 2,493 | 0 | 0 | 0 |
06:15:00 | 150,5 | 0 | -0,247 | 0,439 | -0,686 | 0 | 0 | 0 |
06:18:00 | 148,8 | 0 | -1,616 | 0,402 | -2,017 | 0 | 0 | 0 |
What's the rolling average window?
Also, your level difference column seems to carry more precision than the buffer level column.
let
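// Sample data as posted, decoded from a compressed Base64 payload; swap the Source step for your own data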
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("VZTZkQMhDERT2fI3rpKEDnAqLuefhjl0eL+YYXigVjfzfj9AXtBfAI/2QBtN1rif//bYhMZ5/rS7UH3hgEa+EJt1u4tFzzsOSmAW0B2ABkMuwHt8QmOFIJCKGElIn/8JAE1CglilJDFseFGn0NapgFEAOrC2VPYzaO/yXN9MAiGMDs3Ge7wrVPEXoSV9JMKB4GnqQaxN9rr6Hp+j8cjCyBxRPd06iKxNrySe806ozUA6BAJHy0G0ofJFTqfXxBYXSLgto0Eg1MD0FmZyJ1QwES2EHFnnue885u1gl57ELILTFLR+CXaCLMVzGC8zbcS1pQsxt3VwAlLSCwAmt+QeJZjCeRRQ0YJO3l2OMGZ8BYsI3TspeD3kk8qVAxBLJGwXbTOQ3ig8hHkndGRdErZLr6T01tkRkTth0/O43QgE65SVszl/Qr/zGB5qXXKBCteKsLmJAo6QJaKFYMkn8MLIXVSaiYTvbMcUD1fHe0qfHi7DHkjeeCkfqU2/WqzjvPMsQAqI39Uui+3XlVVW3CytK88jrd/iUd0V7xesOj+fLw==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Timestamp = _t, #"Buffer level" = _t, #"Valve open/close" = _t, #"Level difference" = _t, #"rolling average level difference" = _t, #"corrected level difference" = _t]),
#"Added Index" = Table.AddIndexColumn(Source, "Index", 0, 1, Int64.Type),
#"Changed Type" = Table.TransformColumnTypes(#"Added Index",{{"Timestamp", type time}, {"Buffer level", type number}, {"Level difference", type number}, {"rolling average level difference", type number}, {"corrected level difference", type number}},"nl"),
#"Added Custom" = Table.AddColumn(#"Changed Type", "Unloading", each if [#"Valve open/close"]="0" then 0 else[corrected level difference],type number),
#"Added Custom1" = Table.AddColumn(#"Added Custom", "Cumul difference", (k)=> if k[Unloading]=0 then 0 else List.Sum(Table.SelectRows(#"Added Custom",each [Index]<=k[Index])[Unloading]))
in
#"Added Custom1"
How to use this code: create a new Blank Query, click "Advanced Editor", replace the code in the window with the code provided here, and click "Done". Once you have examined the code, replace the Source step with your own data source.
Hello,
Thank you for your reply!
Unfortunately this does not solve my problem (I had already arrived at this result). The cumulative difference does not reset to 0 after the valve has closed and opened again, and that is exactly what I need in order to calculate how much has been unloaded.
The -27,06 (at 05:51:00 in the "What I get" table above) needs to be -1,05: basically the sum of the previous cell (0) plus the level difference now that the valve is open again.
It's a very straightforward operation in Excel, but I can't seem to make it work here; every single option I've tried leads to exactly this result 😞
Ah, I have misunderstood the ask. You want it the other way round.
let
Query2 = let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("VZTZkQMhDERT2fI3rpKEDnAqLuefhjl0eL+YYXigVjfzfj9AXtBfAI/2QBtN1rif//bYhMZ5/rS7UH3hgEa+EJt1u4tFzzsOSmAW0B2ABkMuwHt8QmOFIJCKGElIn/8JAE1CglilJDFseFGn0NapgFEAOrC2VPYzaO/yXN9MAiGMDs3Ge7wrVPEXoSV9JMKB4GnqQaxN9rr6Hp+j8cjCyBxRPd06iKxNrySe806ozUA6BAJHy0G0ofJFTqfXxBYXSLgto0Eg1MD0FmZyJ1QwES2EHFnnue885u1gl57ELILTFLR+CXaCLMVzGC8zbcS1pQsxt3VwAlLSCwAmt+QeJZjCeRRQ0YJO3l2OMGZ8BYsI3TspeD3kk8qVAxBLJGwXbTOQ3ig8hHkndGRdErZLr6T01tkRkTth0/O43QgE65SVszl/Qr/zGB5qXXKBCteKsLmJAo6QJaKFYMkn8MLIXVSaiYTvbMcUD1fHe0qfHi7DHkjeeCkfqU2/WqzjvPMsQAqI39Uui+3XlVVW3CytK88jrd/iUd0V7xesOj+fLw==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Timestamp = _t, #"Buffer level" = _t, #"Valve open/close" = _t, #"Level difference" = _t, #"rolling average level difference" = _t, #"corrected level difference" = _t]),
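// GroupKind.Local groups only consecutive rows sharing the same "Valve open/close" value, so each contiguous open (or closed) stretch becomes its own sub-table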
#"Grouped Rows" = Table.Group(Source, {"Valve open/close"}, {{"rows", each process(_)}},GroupKind.Local),
process = (t)=>
let
#"Changed Type" = Table.TransformColumnTypes(t,{{"Valve open/close", Int64.Type},{"corrected level difference", type number}},"nl"),
#"Added Index" = Table.AddIndexColumn(#"Changed Type", "Index", 0, 1, Int64.Type),
#"Added Custom" = Table.AddColumn(#"Added Index", "Unloading", each [#"Valve open/close"]*[corrected level difference]),
#"Added Custom1" = Table.AddColumn(#"Added Custom", "Cumul difference", (k)=> List.Sum(Table.SelectRows(#"Added Custom",each [Index]<=k[Index])[Unloading]))
in
#"Added Custom1"
in
#"Grouped Rows",
#"Expanded rows" = Table.ExpandTableColumn(Query2, "rows", {"Timestamp", "Buffer level", "Level difference", "rolling average level difference", "corrected level difference", "Unloading", "Cumul difference"}, {"Timestamp", "Buffer level", "Level difference", "rolling average level difference", "corrected level difference", "Unloading", "Cumul difference"}),
#"Reordered Columns" = Table.ReorderColumns(#"Expanded rows",{"Timestamp", "Buffer level", "Valve open/close", "Level difference", "rolling average level difference", "corrected level difference", "Unloading", "Cumul difference"})
in
#"Reordered Columns"
Thank you!! This works indeed.
However, I now have the problem that the rolling average, when performed in Power Query, takes waaay too long (and this cumulative calculation depends on the rolling average column).
Is there a way to make it efficient?
This is what I use now:
= let
    v = #"Added Index"[Value]
  in
    Table.AddColumn(#"Added Index", "MA5_PQ", each
        let pos1 = List.Max({0, [Index] - 4})
        in List.Average(List.Range(v, pos1, [Index] - pos1 + 1)))
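If the slowdown comes from List.Range re-walking a lazily evaluated column for every row, one common fix is to materialise the column once with List.Buffer. A minimal sketch, assuming the same #"Added Index" step and numeric [Value] column as in the snippet above:

= let
    // Buffer the column once so every List.Range call is a cheap in-memory
    // slice instead of re-evaluating the upstream steps per row
    v = List.Buffer(#"Added Index"[Value])
  in
    Table.AddColumn(#"Added Index", "MA5_PQ", each
        let pos1 = List.Max({0, [Index] - 4})
        in List.Average(List.Range(v, pos1, [Index] - pos1 + 1)))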
Please provide sample data that fully covers your issue.
You cannot do anything recursive in Power BI. Change your process to make it non-recursive, or use Power Query (List.Accumulate), as sketched below.
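For illustration, a minimal, self-contained List.Accumulate sketch that produces a cumulative sum resetting whenever the valve closes; the sample lists are hypothetical, not taken from your file:

let
    // Hypothetical sample data: valve flags and per-row level differences
    valve = {0, 0, 1, 1, 1, 0, 1, 1},
    diffs = {0.5, -0.4, -2.2, -8.5, -5.7, 0.2, -1.1, -3.7},
    // List.Accumulate(list, seed, accumulator): the seed record carries the
    // running total and the output list built so far; the total is reset to 0
    // on every row where the valve flag is 0
    result = List.Accumulate(
        List.Positions(valve),
        [running = 0, out = {}],
        (state, i) =>
            let next = if valve{i} = 0 then 0 else state[running] + diffs{i}
            in [running = next, out = state[out] & {next}]
    )[out]
in
    result // {0, 0, -2.2, -10.7, -16.4, 0, -1.1, -4.8}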
Please provide sample data that covers your issue or question completely, in a usable format (not as a screenshot).
Do not include sensitive information or anything not related to the issue or question.
If you are unsure how to upload data please refer to https://community.fabric.microsoft.com/t5/Community-Blog/How-to-provide-sample-data-in-the-Power-BI-...
Please show the expected outcome based on the sample data you provided.