I need to create an index column that increments by 1 every time the Flag value changes back and forth. It also needs to be in ascending order by Timestamp (which is aligned with Index). I created a basic index starting from 0, and I also tried creating a grouped counter (the Counter column) in PQ, but that didn't work as expected. The 'Desired Result' column is what I need. Can anyone please help me? Either PQ or DAX is fine.
Timestamp | Index | Flag | Counter | Desired Result |
8/20/2016 4:32 | 0 | Bad | 1 | 1 |
8/20/2016 4:38 | 1 | Bad | 2 | 1 |
8/20/2016 4:43 | 2 | Bad | 3 | 1 |
8/20/2016 4:48 | 3 | Bad | 4 | 1 |
8/20/2016 4:53 | 4 | Bad | 5 | 1 |
8/20/2016 4:58 | 5 | Bad | 6 | 1 |
8/20/2016 5:03 | 6 | Bad | 7 | 1 |
8/20/2016 5:08 | 7 | Bad | 8 | 1 |
8/20/2016 5:13 | 8 | Bad | 9 | 1 |
8/20/2016 5:19 | 9 | Good | 1 | 2 |
8/20/2016 5:24 | 10 | Good | 2 | 2 |
8/20/2016 5:29 | 11 | Good | 3 | 2 |
8/20/2016 5:34 | 12 | Good | 4 | 2 |
8/20/2016 5:39 | 13 | Bad | 10 | 3 |
8/20/2016 5:44 | 14 | Bad | 11 | 3 |
8/20/2016 5:49 | 15 | Bad | 12 | 3 |
8/20/2016 5:54 | 16 | Good | 5 | 4 |
8/20/2016 5:59 | 17 | Bad | 13 | 5 |
8/20/2016 6:04 | 18 | Bad | 14 | 5 |
8/20/2016 6:09 | 19 | Good | 6 | 6 |
8/20/2016 6:14 | 20 | Good | 7 | 6 |
8/20/2016 6:20 | 21 | Good | 8 | 6 |
8/20/2016 6:25 | 22 | Bad | 15 | 7 |
@itsme can be done with PQ for sure
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("bdIxDoMwDAXQq1SZKzWxk2AYu/QQiKFSd+6/lX67MiKWCMt/OInNuiZ5UD6e0m91YUr3lI/1fH+Od8Ha7hckFimiEFW2SBHHSCxSVEPU2CJFLUZikaIeoLZktkjRFCOxSJGEqLBFiuYYzRa99v3fTRoU4dbZGcUMexRnHDJGNXJWY4Zq3nnszwOrqOa9x/4BQzXvPvYfWUO17mdrON/AUM0ngGO2C+tLRjWfAY4ZMFQ7TaFjXRm+ptMUppBB0GkKErPfzcj/fvRmStv2BQ==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Timestamp = _t, Index = _t, Flag = _t, Counter = _t, #"Desired Result" = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Timestamp", type datetime}, {"Index", Int64.Type}, {"Flag", type text}, {"Counter", Int64.Type}, {"Desired Result", Int64.Type}}),
#"Removed Columns" = Table.RemoveColumns(#"Changed Type",{"Counter", "Desired Result", "Index"}),
#"Added Index" = Table.AddIndexColumn(#"Removed Columns", "Index", 1, 1, Int64.Type),
#"Added Index1" = Table.AddIndexColumn(#"Added Index", "Index.1", 0, 1, Int64.Type),
#"Merged Queries" = Table.NestedJoin(#"Added Index1", {"Index.1"}, #"Added Index1", {"Index"}, "Added Index1", JoinKind.LeftOuter),
#"Expanded Added Index1" = Table.ExpandTableColumn(#"Merged Queries", "Added Index1", {"Flag"}, {"Flag.1"}),
#"Sorted Rows" = Table.Sort(#"Expanded Added Index1",{{"Index", Order.Ascending}}),
#"Added Custom" = Table.AddColumn(#"Sorted Rows", "Custom", each if [Index] = 1 then 1 else if [Flag] <> [Flag.1] then 1 else 0),
Custom = #"Added Custom"[Custom],
Custom1 = List.Generate(()=>[i=0,j= Custom{0},k=j],
each [i]<List.Count( Custom),
each [i=[i]+1,j=[j]+ Custom{i},k=j],
each [k]),
Combine = List.Combine({Table.ColumnNames(#"Added Custom"),{"desiredIndex"}}),
new = Table.FromColumns(Table.ToColumns(#"Added Custom")&{Custom1},Combine),
#"Removed Columns1" = Table.RemoveColumns(new,{"Index.1", "Flag.1", "Custom"}),
#"Sorted Rows1" = Table.Sort(#"Removed Columns1",{{"Index", Order.Ascending}})
in
#"Sorted Rows1"
Thank you! It worked!
Just a note: it takes a long time to load into PBI! But I was expecting that, as I had tested other solutions that didn't work. My dataset has 8,700 rows, which isn't even the full dataset, and it took between 5 and 10 minutes to load. My original dataset has over 3M rows, which I will need to use after I finish building out this PBI report. A little worried about that, fingers crossed! There is a decimal column that wasn't included in this example but is part of my dataset, and it's what the flag is based on. I wonder if that's the main culprit, though I would think the datetime column takes up more memory.
Either way, thank you!
@itsme it is a known fact that PQ has performance issues. On top of that, my solution involves a do..while-style loop, which will delay processing even further. Your dataset is 3M+ rows. Are you getting it from a SQL database or some other RDBMS? If yes, it would be far wiser to see if you can achieve this server-side rather than relying on PQ for a 3 million-row dataset. There are some known tweaks for PQ performance tuning in @ImkeF's blog. But running a List.Generate over a 3 million-row dataset is not going to be lightning fast for sure. I don't think DAX has the option of running a similar loop, or else it would have been way quicker. You can keep this thread open and ask the DAX experts to take a look and see whether it is achievable. If that is not possible and @parry2k's solution meets your purpose, I personally think that would be the quicker option in your case. But I would strongly advise you to do this server-side.
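If the data does come from SQL Server or a similar RDBMS, one way to push this to the server is the classic gaps-and-islands pattern run as a native query, so Power Query only loads the finished result. This is just a sketch - the server, database, and table names (MyServer, MyDb, dbo.FlagData) are placeholders, not anything from this thread:
let
    // placeholder connection and table names - replace with your own
    Source = Sql.Database("MyServer", "MyDb"),
    Result = Value.NativeQuery(
        Source,
        "SELECT [Timestamp], [Flag],
                SUM(ChangeFlag) OVER (ORDER BY [Timestamp] ROWS UNBOUNDED PRECEDING) AS [Desired Result]
         FROM ( SELECT [Timestamp], [Flag],
                       CASE WHEN LAG([Flag]) OVER (ORDER BY [Timestamp]) IS NULL
                                 OR [Flag] <> LAG([Flag]) OVER (ORDER BY [Timestamp])
                            THEN 1 ELSE 0 END AS ChangeFlag
                FROM dbo.FlagData ) AS t;")
in
    Result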
@smpa01 Sorry for the super delayed response! This was for a client; I did advise them that we should do this on the server side, but it seems these are .csv exports from an application they're using, and they didn't have the resources to put the data in a DB. Since they were only refreshing the data once a month and creating those columns in Excel was pretty easy, we ended up doing that and then bringing the file into PBI to create the DAX measures, etc. I will try @parry2k's solution to see if it works out! Thanks for your help. BTW, that new Animated Sales Calendar is pretty cool!
Hello @itsme,
Code-wise, @parry2k's solution should already run faster than @smpa01's solution - have you tried it?
But I believe you could speed that up further using this approach:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("bZI7DoNADESvEm2NlLX3k2VLmhwCUURKz/27mJEsIzsSUAxPw/qZfU/jyVku6o86C6clZbm3z1eelI7FAQOxAhyAWhArUCIwECtQA9AKYgVaBAZiBboD2swFsQKvCAzECowAUEGswBqBFfGS3uf5z1SbfI1A2RDvSpCrgMgQb6vNghY2xPsSBC1mVL7pkYoWc0rxuBUtZpXicRtaup3Fr0YQtJhZ8hP1mdFibslPJAhabnb9ivvEMHyz65fcJ97yza5fsyDXvGx/q8x/HD8=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Timestamp = _t, Index = _t, Flag = _t, Counter = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Timestamp", type datetime}, {"Index", Int64.Type}, {"Flag", type text}, {"Counter", Int64.Type}}, "en-us"),
#"Grouped Rows" = Table.Group(#"Changed Type", {"Flag"}, {{"Partition", each _}}, GroupKind.Local),
#"Added Index" = Table.AddIndexColumn(#"Grouped Rows", "Index", 1, 1, Int64.Type),
#"Expanded Partition" = Table.ExpandTableColumn(#"Added Index", "Partition", {"Timestamp", "Index", "Counter"}, {"Timestamp", "Index.1", "Counter"})
in
#"Expanded Partition"
This will run particularly fast if the data already comes in sorted, like in your sample.
The 4th parameter of Table.Group (GroupKind.Local) is the "secret sauce" here.
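To illustrate, here is a tiny standalone example (with made-up Flag values, not your data). With the default GroupKind.Global the grouping would collapse everything into two buckets, but GroupKind.Local starts a new group every time the value in the grouping column changes - which is exactly what numbers the back-and-forth runs:
let
    // made-up sample: three consecutive runs of Flag values
    Source = Table.FromRecords({
        [Flag = "Bad"], [Flag = "Bad"],
        [Flag = "Good"], [Flag = "Good"],
        [Flag = "Bad"]}),
    LocalGroups = Table.Group(Source, {"Flag"}, {{"Rows", each Table.RowCount(_), Int64.Type}}, GroupKind.Local)
    // LocalGroups has 3 rows (Bad, Good, Bad); GroupKind.Global would return just 2 (Bad, Good)
in
    LocalGroups
Adding an index to that grouped table (as in the query above) then gives one number per run.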
Imke Feldmann (The BIccountant)
@itsme start a new Power Query, click on Advanced Editor, and paste this code; you can then see the steps and apply them to your table:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("bdIxDoMwDAXQq1SZKzWxk2AYu/QQiKFSd+6/lX67MiKWCMt/OInNuiZ5UD6e0m91YUr3lI/1fH+Od8Ha7hckFimiEFW2SBHHSCxSVEPU2CJFLUZikaIeoLZktkjRFCOxSJGEqLBFiuYYzRa99v3fTRoU4dbZGcUMexRnHDJGNXJWY4Zq3nnszwOrqOa9x/4BQzXvPvYfWUO17mdrON/AUM0ngGO2C+tLRjWfAY4ZMFQ7TaFjXRm+ptMUppBB0GkKErPfzcj/fvRmStv2BQ==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Timestamp = _t, Index = _t, Flag = _t, Counter = _t, #"Desired Result" = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Timestamp", type datetime}, {"Index", Int64.Type}, {"Flag", type text}, {"Counter", Int64.Type}, {"Desired Result", Int64.Type}}),
#"Removed Columns" = Table.RemoveColumns(#"Changed Type",{"Desired Result", "Counter"}),
#"Added Index" = Table.AddIndexColumn(#"Removed Columns", "Index.1", 1, 1, Int64.Type),
#"Merged Queries" = Table.NestedJoin(#"Added Index", {"Index"}, #"Added Index", {"Index.1"}, "Added Index", JoinKind.LeftOuter),
#"Expanded Added Index" = Table.ExpandTableColumn(#"Merged Queries", "Added Index", {"Flag"}, {"Flag.1"}),
#"Sorted Rows" = Table.Sort(#"Expanded Added Index",{{"Index.1", Order.Ascending}}),
#"Added Custom" = Table.AddColumn(#"Sorted Rows", "Match", each if [Flag]<>[Flag.1] then [Index] else null, Int64.Type),
#"Filled Down" = Table.FillDown(#"Added Custom",{"Match"}),
#"Removed Columns1" = Table.RemoveColumns(#"Filled Down",{"Index", "Index.1", "Flag.1"}),
#"Grouped Rows1" = Table.Group(#"Removed Columns1", {"Flag", "Match"}, {{"All", each _, type table [Timestamp=nullable datetime, Flag=nullable text, Match=number]}}),
#"Added Index1" = Table.AddIndexColumn(#"Grouped Rows1", "Index", 1, 1, Int64.Type),
#"Expanded All" = Table.ExpandTableColumn(#"Added Index1", "All", {"Timestamp"}, {"Timestamp"}),
#"Removed Columns2" = Table.RemoveColumns(#"Expanded All",{"Match"})
in
#"Removed Columns2"
Haven't tried this solution but thank you.