
melmell
Frequent Visitor

Add a unique identifier column to blocks with repeating values

Hey All.

I have time-series data with the following structure.

The "Stage" values are repeating across the table for different periods of time.

seconds  Stage
1        A
2        A
3        B
4        B
5        B
6        B
7        C
8        C
9        A
 

I would like to add a "Stage_ID" column that identifies each stage period uniquely.

seconds  Stage  Stage_ID
1        A      111
2        A      111
3        B      222
4        B      222
5        B      222
6        B      222
7        C      333
8        C      333
9        A      444

Is there a way to achieve this with Power Query / DAX?

 

1 ACCEPTED SOLUTION
Jimmy801
Community Champion

Hello @melmell 

 

Here's a similar approach to @CNENFRNL's, but with the stage periods numbered 1, 2, etc.

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMlTSUXJUitWJVjKCs4yBLCcwywTOMoWzzOAscyDLGcyygLMsIabEAgA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [seconds = _t, Stage = _t]),
    // GroupKind.Local groups only consecutive rows with the same Stage,
    // so each stage period becomes its own group
    GroupLocal = Table.Group
    (
        Source,
        "Stage",
        {{"allrows", each _}},
        GroupKind.Local
    ),
    // number the groups 1, 2, 3, ...
    AddIndex = Table.AddIndexColumn(GroupLocal, "Index", 1, 1),
    // bring the original rows back out of each group
    ExpandTable = Table.ExpandTableColumn(AddIndex, "allrows", {"seconds"}, {"seconds"})
in
    ExpandTable

 

Copy and paste this code into the Advanced Editor of a new blank query to see how the solution works.
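As an aside, the local-grouping idea maps neatly onto Python's itertools.groupby, which also groups only runs of consecutive equal keys. A minimal sketch, with sample data assumed from the question:

```python
from itertools import groupby

# Sample rows shaped like the data in the question
rows = [(1, "A"), (2, "A"), (3, "B"), (4, "B"), (5, "B"),
        (6, "B"), (7, "C"), (8, "C"), (9, "A")]

# groupby only groups *consecutive* equal keys, which is exactly
# what GroupKind.Local does in the M code above.
result = []
for index, (stage, run) in enumerate(groupby(rows, key=lambda r: r[1]), start=1):
    for seconds, _ in run:
        result.append((seconds, stage, index))
```

Each run of equal stages becomes one group, numbered from 1, just like the Index column the M query adds.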

If this post helps or solves your problem, please mark it as the solution (to help other users find useful content and to acknowledge the work of the users who helped you).
Kudos are nice too

Have fun

Jimmy

 

 

View solution in original post

6 REPLIES
v-alq-msft
Community Support

Hi, @melmell 

 

Based on your description, I created data to reproduce your scenario. The pbix file is attached at the end.

Table:

[Screenshot: c1.png]

 

You may create a calculated column as below.

Column = 
var tab =
    ADDCOLUMNS(
        'Table',
        "Flag",
        // 1 when the stage differs from the previous second's (or there is no previous row), else 0
        var laststage =
            LOOKUPVALUE('Table'[Stage], 'Table'[Seconds], [Seconds] - 1, BLANK())
        return
            IF(
                ISBLANK(laststage) || laststage <> [Stage],
                1, 0
            )
    )
var newtab =
    ADDCOLUMNS(
        tab,
        "Result",
        // running sum of the flags gives the block number;
        // val*100 + val*10 + val repeats the digit (111, 222, ...)
        var val =
            SUMX(
                FILTER(
                    tab,
                    [Seconds] <= EARLIER('Table'[Seconds])
                ),
                [Flag]
            )
        return
            val * 100 + val * 10 + val
    )
return
SUMX(
    FILTER(
        newtab,
        [Seconds] = EARLIER('Table'[Seconds]) &&
        [Stage] = EARLIER('Table'[Stage])
    ),
    [Result]
)
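The row-by-row logic of that calculated column can be mirrored in plain Python as a hedged sketch, using the sample data from the question: flag a row when the previous second's stage is missing or different, keep a running count of the flags, and repeat the digit to form the ID.

```python
# Sample data shaped like the table in the question
rows = [(1, "A"), (2, "A"), (3, "B"), (4, "B"), (5, "B"),
        (6, "B"), (7, "C"), (8, "C"), (9, "A")]
stage_by_second = {sec: stage for sec, stage in rows}

stage_ids = []
block = 0
for sec, stage in rows:
    prev = stage_by_second.get(sec - 1)   # LOOKUPVALUE on [Seconds] - 1
    if prev is None or prev != stage:     # ISBLANK(laststage) || laststage <> [Stage]
        block += 1                        # running sum of the 1/0 flags
    stage_ids.append(block * 100 + block * 10 + block)  # 111, 222, 333, ...
```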

 

Result:

[Screenshot: c2.png]

 

Best Regards

Allan

 

If this post helps, please consider accepting it as the solution to help other members find it more quickly.

CNENFRNL
Community Champion

Hi, @melmell , you may want to try this pattern

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMlTSUXJUitWJVjKCs4yBLCcwywTOMoWzzOAscyDLGcyygLMsIabEAgA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [seconds = _t, Stage = _t]),
    // first "seconds" value of each consecutive run of stages
    bins = Table.Group(Source, {"Stage"}, {{"bin", each List.Min([seconds])}}, GroupKind.Local)[bin],
    // tag each run's first row with a repeated-digit ID ("111", "222", ...), null elsewhere
    #"Added Custom" = Table.AddColumn(Source, "Stage_ID", each [pos = List.PositionOf(bins, [seconds]), result = if pos = -1 then null else Text.Repeat(Text.From(pos + 1), 3)][result]),
    // propagate each ID down to the rest of its run
    #"Filled Down" = Table.FillDown(#"Added Custom", {"Stage_ID"})
in
    #"Filled Down"
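For comparison, a hedged pandas sketch of the same pattern (column names assumed from the sample data): find the first row of each consecutive run, tag it with a repeated-digit ID, and fill down, mirroring Table.FillDown.

```python
import pandas as pd

# Sample data shaped like the table in the question
df = pd.DataFrame({"seconds": range(1, 10), "Stage": list("AABBBBCCA")})

# First "seconds" value of each consecutive run of stages
# (mirrors Table.Group with GroupKind.Local + List.Min in the M code)
starts = df.loc[df["Stage"] != df["Stage"].shift(), "seconds"].tolist()

# Tag each run's first row with a repeated-digit ID (Text.Repeat),
# leave the other rows empty, then fill down (Table.FillDown)
df["Stage_ID"] = df["seconds"].map(
    lambda s: str(starts.index(s) + 1) * 3 if s in starts else None
)
df["Stage_ID"] = df["Stage_ID"].ffill()
```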

 

[Screenshot: 2020-10-06 161911.png]


Thanks to the great efforts of MS engineers to simplify the syntax of DAX! Most beginners are SUCCESSFULLY MISLED into thinking that they can easily master DAX; but it turns out that the intricacy of the most frequently used RANKX() is still way beyond their comprehension!

DAX is simple, but NOT EASY!

Hey @CNENFRNL ,

I tried your solution. It works well for small datasets.

I have a 200K-row table, and the processing hung and didn't finish.

I'm thinking about using Python instead.
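Since Python is on the table: a minimal pandas sketch (column names assumed from the sample data) that numbers consecutive stage runs with a single vectorized expression. Being vectorized, this approach should handle 200K rows comfortably.

```python
import pandas as pd

# Sample data shaped like the table in the question
df = pd.DataFrame({"seconds": range(1, 10), "Stage": list("AABBBBCCA")})

# A new block starts wherever Stage differs from the previous row;
# the cumulative sum of those change points numbers the blocks 1, 2, 3, ...
df["Stage_ID"] = (df["Stage"] != df["Stage"].shift()).cumsum()
```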

Jimmy801
Community Champion

Hello @melmell 

 

Did you check my solution? I'm wondering how the performance of this code is.

 

BR

 

Jimmy

Hey @Jimmy801 

Just finished the testing. Worked like a charm.

Performance-wise, 200K rows loaded within several minutes.

 
