RShackelford
Frequent Visitor

Yet Another Consecutive Values Question

I've seen this question asked quite a bit, but I can't seem to find a solution that works for me or doesn't thoroughly confuse me.

I'd like to create a calculated column that counts consecutive "Bad" indicators. I don't care so much about single instances, good or bad. I'd like an output similar to the Result column below.

If a measure would work better, that's fine with me. But it's a calculation I'm going to use quite a bit and, ultimately, I'd like to be able to pull some numbers that say "we had X 'consecutive bad' streaks this month on Y servers, and the average streak length was Z (2.3 consecutive bads, or whatever)".

 

| Server | Date     | Indicator | Result     |
|--------|----------|-----------|------------|
| A      | 1/1/2021 | Good      | Don't care |
| A      | 1/2/2021 | Bad       | Don't care |
| A      | 1/3/2021 | Good      | Don't care |
| A      | 1/4/2021 | Bad       | 2 Bads     |
| A      | 1/5/2021 | Bad       | 2 Bads     |
| B      | 1/1/2021 | Good      | Don't care |
| B      | 1/2/2021 | Good      | Don't care |
| B      | 1/3/2021 | Good      | Don't care |
| B      | 1/4/2021 | Bad       | Don't care |
| B      | 1/5/2021 | Good      | Don't care |
| C      | 1/1/2021 | Bad       | Don't care |
| C      | 1/2/2021 | Good      | Don't care |
| C      | 1/3/2021 | Bad       | 3 Bads     |
| C      | 1/4/2021 | Bad       | 3 Bads     |
| C      | 1/5/2021 | Bad       | 3 Bads     |
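For what it's worth, the streak logic and the summary numbers described here can be sketched outside of Power BI in a few lines of Python (illustrative only; the `rows` list and the helper name `bad_runs` are made up to mirror the sample table above):

```python
from itertools import groupby

# Sample rows mirroring the table above: (server, date, indicator).
rows = [
    ("A", "1/1/2021", "Good"), ("A", "1/2/2021", "Bad"), ("A", "1/3/2021", "Good"),
    ("A", "1/4/2021", "Bad"), ("A", "1/5/2021", "Bad"),
    ("B", "1/1/2021", "Good"), ("B", "1/2/2021", "Good"), ("B", "1/3/2021", "Good"),
    ("B", "1/4/2021", "Bad"), ("B", "1/5/2021", "Good"),
    ("C", "1/1/2021", "Bad"), ("C", "1/2/2021", "Good"), ("C", "1/3/2021", "Bad"),
    ("C", "1/4/2021", "Bad"), ("C", "1/5/2021", "Bad"),
]

def bad_runs(rows):
    """Return (server, length) for each run of 2+ consecutive 'Bad' rows."""
    runs = []
    for server in sorted({r[0] for r in rows}):
        # Assumes rows are already sorted by date within each server.
        indicators = [r[2] for r in rows if r[0] == server]
        for value, group in groupby(indicators):
            length = sum(1 for _ in group)
            if value == "Bad" and length >= 2:
                runs.append((server, length))
    return runs

runs = bad_runs(rows)
count = len(runs)                        # number of "consecutive bad" streaks
servers = len({s for s, _ in runs})      # number of servers affected
avg = sum(n for _, n in runs) / count if count else 0
print(count, servers, avg)               # sample data: 2 streaks on 2 servers, avg 2.5
```

A streak only counts when it spans two or more rows, matching the "don't care about single instances" rule.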
4 REPLIES
Jihwan_Kim
Super User

Hi, @RShackelford 

Please check the picture below and the sample pbix file's link further down. It shows how to create the new columns.

 

Picture1.png

 

Step one CC =
VAR currentserver = Data[Server]
VAR rankbydate =
    RANKX ( FILTER ( Data, Data[Server] = currentserver ), Data[Date],, DESC )
VAR flag =
    IF (
        Data[Indicator] = "Bad"
            && MAXX (
                FILTER (
                    Data,
                    Data[Server] = currentserver
                        && RANKX ( FILTER ( Data, Data[Server] = currentserver ), Data[Date],, DESC )
                            = rankbydate + 1
                ),
                Data[Indicator]
            ) = "Bad",
        1
    )
VAR flagtwo =
    IF (
        Data[Indicator] = "Bad"
            && MAXX (
                FILTER (
                    Data,
                    Data[Server] = currentserver
                        && RANKX ( FILTER ( Data, Data[Server] = currentserver ), Data[Date],, DESC )
                            = rankbydate - 1
                ),
                Data[Indicator]
            ) = "Bad",
        1
    )
RETURN
    flag + flagtwo
 
 
Result CC =
VAR currentserver = Data[Server]
VAR steponeCCcountrow =
    COUNTROWS (
        FILTER ( Data, Data[Server] = currentserver && Data[Step one CC] <> BLANK () )
    )
RETURN
    IF ( NOT ISBLANK ( Data[Step one CC] ), steponeCCcountrow & " Bads" )
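To restate the logic of the two flags in plainer terms, here is a Python sketch (assuming rows are already sorted by date within a server; `flag_bads` is a hypothetical helper, not part of the DAX above): a row is flagged when it is "Bad" and its immediate neighbor on either side is also "Bad", and Result CC then counts the flagged rows per server.

```python
def flag_bads(indicators):
    """Mimic the flag/flagtwo idea: a row counts only if it is 'Bad'
    AND the previous or next row (same server, date order) is also 'Bad'."""
    n = len(indicators)
    flags = []
    for i, ind in enumerate(indicators):
        prev_bad = i > 0 and indicators[i - 1] == "Bad"
        next_bad = i < n - 1 and indicators[i + 1] == "Bad"
        flags.append(ind == "Bad" and (prev_bad or next_bad))
    return flags

# Server C from the sample: the lone leading "Bad" is not flagged,
# but the three-row streak is.
print(flag_bads(["Bad", "Good", "Bad", "Bad", "Bad"]))
```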
 
 


Microsoft MVP



If this post helps, then please consider accepting it as the solution to help other members find it faster, and give a big thumbs up.





Thank You @Jihwan_Kim

 

That worked for the dataset I provided in the example. However, I need to separate out the different consecutive instances within the same server (see server C in the table below). My apologies, I should have specified this in the original example.

 

| Server | Date     | Indicator | Result     |
|--------|----------|-----------|------------|
| A      | 1/1/2021 | Good      | Don't care |
| A      | 1/2/2021 | Bad       | Don't care |
| A      | 1/3/2021 | Good      | Don't care |
| A      | 1/4/2021 | Bad       | 2 Bads     |
| A      | 1/5/2021 | Bad       | 2 Bads     |
| B      | 1/1/2021 | Good      | Don't care |
| B      | 1/2/2021 | Good      | Don't care |
| B      | 1/3/2021 | Good      | Don't care |
| B      | 1/4/2021 | Bad       | Don't care |
| B      | 1/5/2021 | Good      | Don't care |
| C      | 1/1/2021 | Bad       | Don't care |
| C      | 1/2/2021 | Good      | Don't care |
| C      | 1/3/2021 | Bad       | 3 Bads     |
| C      | 1/4/2021 | Bad       | 3 Bads     |
| C      | 1/5/2021 | Bad       | 3 Bads     |
| C      | 1/6/2021 | Good      | Don't care |
| C      | 1/7/2021 | Bad       | 2 Bads     |
| C      | 1/8/2021 | Bad       | 2 Bads     |

Hi, @RShackelford 

Thank you for your feedback.

Please check the updated version below.

 

Picture1.png

 

Step one Second Version =
VAR currentserver = Data[Server]
VAR currentdate = Data[Date]
VAR flagcumulate =
    IF (
        Data[Indicator] = "Good",
        BLANK (),
        SUMX (
            ADDCOLUMNS (
                SUMMARIZE (
                    FILTER ( Data, Data[Server] = currentserver && Data[Date] <= currentdate ),
                    Data[Server],
                    Data[Date],
                    Data[Indicator]
                ),
                "@flag", IF ( Data[Indicator] = "Good", 1, 0 )
            ),
            [@flag]
        )
    )
RETURN
    flagcumulate
 
 
Result CC Second Version =
VAR currentstepone = Data[Step one Second Version]
VAR currentserver = Data[Server]
VAR newtable =
    FILTER (
        Data,
        Data[Server] = currentserver
            && Data[Step one Second Version] == currentstepone
    )
VAR result =
    COUNTROWS ( newtable )
RETURN
    IF ( Data[Indicator] = "Good" || result <= 1, "Don't care", result & " Bads" )
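The grouping trick in Step one Second Version can be restated as a short Python sketch (`group_ids` is a hypothetical helper; rows are assumed sorted by date within a server): tag each "Bad" row with the number of "Good" rows seen before it, so an unbroken streak of "Bad" rows shares one tag, and counting rows per (server, tag) gives each streak's length.

```python
def group_ids(indicators):
    """For each 'Bad' row, the running count of 'Good' rows seen so far;
    consecutive 'Bad' rows therefore share one id. 'Good' rows get None
    (the BLANK() in the DAX version)."""
    goods = 0
    ids = []
    for ind in indicators:
        if ind == "Good":
            goods += 1
            ids.append(None)
        else:
            ids.append(goods)
    return ids

# Server C from the updated sample: the 3-row streak shares id 1,
# the later 2-row streak shares id 2, the lone leading "Bad" has id 0.
print(group_ids(["Bad", "Good", "Bad", "Bad", "Bad", "Good", "Bad", "Bad"]))
```

Result CC Second Version then simply counts the rows that share the current row's id within the same server, and labels anything of length 1 (or any "Good" row) as "Don't care".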
 
 
 
 



@Jihwan_Kim 

Thank You! That's exactly what I needed.

Your solution is much cleaner than many I've seen. 
