
Anonymous
Not applicable

Calculate how much time an issue spent in a certain status to get its lead time

Hello,

 

I want to create a report that shows the lead time of each issue, but you should also be able to see how much time it spent in each status. It should look something like this:

[Screenshot: 2022-06-15 14_02_11-Window.png]

 

I made a query to get the needed data; it looks like this:

[Screenshot: 2022-06-15 13_59_32-Window.png]

The thing is, an issue can go into the same status multiple times, as you can see: Created -> Backlog -> Analyzing -> Backlog -> Analyzing, and so on.

What should happen is that each time an issue enters a certain status, the time spent there gets summed up for that status and calculated into a new column, so it looks something like this:

[Screenshot: 2022-06-15 14_24_28-Window.png]

 

I really hope someone can help me with this, because I don't know how to do it.

5 REPLIES
vanessafvg
Super User

What are you struggling with exactly? I'm not quite sure I understand what you are trying to do. Do you want to do this in the table or in a visual? It makes sense to sum it in a visual.

 

If you can, please share your data in text form or provide some sample data.





If I took the time to answer your question and I came up with a solution, please mark my post as a solution and/or give kudos freely for the effort 🙂 Thank you!

Proud to be a Super User!




Anonymous
Not applicable

oldvalue | oldstring | newvalue | newstring | id | issue_key | statuschangedate | issuecreated | issuelastupdated | BacklogTime | AnalyzingTime | ReadyForPiPlanning
10321 | Backlog | 10807 | Analyzing | 117816 | EART-10 | 04.10.2018 13:35 | 01.10.2018 09:26 | 01.10.2019 11:22 | | |
10807 | Analyzing | 10808 | Ready For PI Planning | 117816 | EART-10 | 04.10.2018 13:42 | 01.10.2018 09:26 | 01.10.2019 11:22 | | |
10808 | Ready For PI Planning | 10807 | Analyzing | 117816 | EART-10 | 12.10.2018 14:06 | 01.10.2018 09:26 | 01.10.2019 11:22 | | |
10807 | Analyzing | 10808 | Ready For PI Planning | 117816 | EART-10 | 12.10.2018 14:11 | 01.10.2018 09:26 | 01.10.2019 11:22 | | |
10808 | Ready For PI Planning | 11410 | Actual PI | 117816 | EART-10 | 29.10.2018 10:20 | 01.10.2018 09:26 | 01.10.2019 11:22 | | |
11410 | Actual PI | 10111 | Implementing | 117816 | EART-10 | 31.10.2018 11:36 | 01.10.2018 09:26 | 01.10.2019 11:22 | | |
10111 | Implementing | 10903 | TVV | 117816 | EART-10 | 07.12.2018 15:56 | 01.10.2018 09:26 | 01.10.2019 11:22 | | |
10903 | TVV | 10906 | Releasing | 117816 | EART-10 | 15.01.2019 09:39 | 01.10.2018 09:26 | 01.10.2019 11:22 | | |
10906 | Releasing | 10002 | Done | 117816 | EART-10 | 18.01.2019 09:44 | 01.10.2018 09:26 | 01.10.2019 11:22 | | |
10321 | Backlog | 10807 | Analyzing | 118666 | EART-1003 | 11.03.2019 15:24 | 11.03.2019 15:23 | 26.09.2019 08:09 | | |
10807 | Analyzing | 10321 | Backlog | 118666 | EART-1003 | 26.03.2019 17:00 | 11.03.2019 15:23 | 26.09.2019 08:09 | | |
10321 | Backlog | 10807 | Analyzing | 118666 | EART-1003 | 11.06.2019 16:38 | 11.03.2019 15:23 | 26.09.2019 08:09 | | |
10807 | Analyzing | 10808 | Ready For PI Planning | 118666 | EART-1003 | 27.06.2019 07:57 | 11.03.2019 15:23 | 26.09.2019 08:09 | | |
10808 | Ready For PI Planning | 11410 | Actual PI | 118666 | EART-1003 | 08.07.2019 10:14 | 11.03.2019 15:23 | 26.09.2019 08:09 | | |
11410 | Actual PI | 3 | In Progress | 118666 | EART-1003 | 08.07.2019 10:16 | 11.03.2019 15:23 | 26.09.2019 08:09 | | |
3 | In Progress | 10111 | Implementing | 118666 | EART-1003 | 08.07.2019 11:06 | 11.03.2019 15:23 | 26.09.2019 08:09 | | |
10111 | Implementing | 10905 | Validating on Staging (Ready for Demo) | 118666 | EART-1003 | 19.09.2019 11:29 | 11.03.2019 15:23 | 26.09.2019 08:09 | | |
10905 | Validating on Staging (Ready for Demo) | 10002 | Done | 118666 | EART-1003 | 26.09.2019 08:09 | 11.03.2019 15:23 | 26.09.2019 08:09 | | |
10321 | Backlog | 10807 | Analyzing | 118667 | EART-1004 | 12.03.2019 11:01 | 12.03.2019 10:17 | 11.07.2019 11:44 | | |
10807 | Analyzing | 10808 | Ready For PI Planning | 118667 | EART-1004 | 26.03.2019 18:59 | 12.03.2019 10:17 | 11.07.2019 11:44 | | |
10808 | Ready For PI Planning | 11410 | Actual PI | 118667 | EART-1004 | 05.04.2019 13:04 | 12.03.2019 10:17 | 11.07.2019 11:44 | | |
11410 | Actual PI | 3 | In Progress | 118667 | EART-1004 | 09.05.2019 12:00 | 12.03.2019 10:17 | 11.07.2019 11:44 | | |
3 | In Progress | 10904 | Validating on Staging (In Progress) | 118667 | EART-1004 | 24.06.2019 22:15 | 12.03.2019 10:17 | 11.07.2019 11:44 | | |
10904 | Validating on Staging (In Progress) | 10002 | Done | 118667 | EART-1004 | 02.07.2019 08:15 | 12.03.2019 10:17 | 11.07.2019 11:44 | | |
10321 | Backlog | 10002 | Done | 118668 | EART-1005 | 13.03.2019 13:03 | 12.03.2019 10:21 | 13.03.2019 13:03 | | |

Here is the data.

Hi @Anonymous ,

 

Do you want to use the last status change date instead of the issue created date when the status goes back to Backlog again?

I suggest you try this code for a calculated column:

BacklogTime =
// Calculated column: days the issue spent in Backlog before this status change
VAR _Lastchangedate =
    // Latest earlier status change of the same issue (blank for the issue's first change)
    CALCULATE (
        MAX ( status_dates_better[statuschangedate] ),
        FILTER (
            ALLEXCEPT ( status_dates_better, status_dates_better[issue_key] ),
            status_dates_better[issue_key] = EARLIER ( status_dates_better[issue_key] )
                && status_dates_better[statuschangedate]
                    < EARLIER ( status_dates_better[statuschangedate] )
        )
    )
RETURN
    IF (
        status_dates_better[oldstring] = "Backlog",
        DATEDIFF (
            IF (
                ISBLANK ( _Lastchangedate ),
                status_dates_better[issuecreated],
                _Lastchangedate
            ),
            status_dates_better[statuschangedate],
            DAY
        )
    )

The result is as below.

[Screenshot: RicoZhou_0-1655793735324.png]

 

Best Regards,
Rico Zhou

 

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

 

 

Anonymous
Not applicable

Thank you! I will try this, but it's just a part of the whole solution I need. I don't know how to get the previous change date per issue_key so that I can calculate in each row how long the issue was in that status. Can you help me with that?
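One way to sketch that per-row calculation (untested, and using the same status_dates_better table and column names from the sample data above; StatusDuration is just an illustrative name for a new calculated column): look up the latest earlier statuschangedate of the same issue_key, fall back to issuecreated when there is none, and take the day difference.

StatusDuration =
// Calculated column: days spent in the status this row is leaving (oldstring)
VAR _Issue = status_dates_better[issue_key]
VAR _Change = status_dates_better[statuschangedate]
// Latest earlier status change of the same issue (blank for the issue's first change)
VAR _PrevChange =
    MAXX (
        FILTER (
            status_dates_better,
            status_dates_better[issue_key] = _Issue
                && status_dates_better[statuschangedate] < _Change
        ),
        status_dates_better[statuschangedate]
    )
// The first recorded change of an issue starts counting from the created date
VAR _EnteredStatus =
    IF ( ISBLANK ( _PrevChange ), status_dates_better[issuecreated], _PrevChange )
RETURN
    DATEDIFF ( _EnteredStatus, status_dates_better[statuschangedate], DAY )

DATEDIFF with DAY counts whole days, as in the formula above; switching the last argument to HOUR or MINUTE would give a finer-grained duration.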

Anonymous
Not applicable

I struggle with the DAX formula to get those time calculations.
When an issue is created, it goes by default into the status Backlog first. When it goes from Backlog into Analyzing, you can just do:

IF (
    status_dates_better[oldstring] = "Backlog",
    DATEDIFF ( status_dates_better[issuecreated], status_dates_better[statuschangedate], DAY )
)

But when the issue goes back to Backlog again, you can't use the created date when it leaves that status for the second time. Instead, you would somehow have to take the date of the previous status change, i.e. the date it re-entered Backlog.

And that's where my difficulty is.
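Assuming a per-row duration column exists (for example the StatusDuration column sketched earlier in this thread; the name is illustrative, not something already in the model), the "summed up per status" and lead time parts could then look roughly like this (untested):

Backlog Days :=
// Measure: total days spent in the Backlog status for the issues in the current filter context
CALCULATE (
    SUM ( status_dates_better[StatusDuration] ),
    status_dates_better[oldstring] = "Backlog"
)

Lead Time Days :=
// Measure: days from creation to the last recorded status change, summed per issue
SUMX (
    VALUES ( status_dates_better[issue_key] ),
    CALCULATE (
        DATEDIFF (
            MIN ( status_dates_better[issuecreated] ),
            MAX ( status_dates_better[statuschangedate] ),
            DAY
        )
    )
)

A matrix visual with issue_key on the rows and oldstring on the columns, using a plain SUM ( status_dates_better[StatusDuration] ) as the value, would give the same per-status breakdown without writing one measure per status.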

 

 
