snortham
New Member

Calculating days between dates in same column

I am trying to calculate the number of days a campaign stayed in each status. With my current formula, if a campaign id has multiple entries on the same day, the days in status are counted twice rather than 0 for the duplicate row. I have highlighted a few examples.

This is the current formula I am using, with [days since] being the number of days since the campaign was created.

 

Days in status =
[days since]
    - LOOKUPVALUE (
        campaigns_logs[days since],
        campaigns_logs[campaign.id], [campaign.id],
        campaigns_logs[change_date].[Date],
            CALCULATE (
                MAX ( campaigns_logs[change_date].[Date] ),
                FILTER (
                    campaigns_logs,
                    campaigns_logs[campaign.id] = EARLIER ( campaigns_logs[campaign.id] )
                        && campaigns_logs[change_date] < EARLIER ( campaigns_logs[change_date] )
                )
            )
    )

 

 
 days in status.PNG
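The double-count can be reproduced with a small hypothetical sample (pandas used here for illustration only): every row looks back to the previous *distinct* date, so two rows logged on the same day both receive the full gap instead of one row getting it and the duplicate getting 0.

```python
import pandas as pd

# Hypothetical log: two status changes recorded on the same day (days since = 4).
logs = pd.DataFrame({
    "campaign.id": [1, 1, 1],
    "days since":  [0, 4, 4],
})

# Buggy logic: each row subtracts the latest earlier "days since" for the
# same campaign, so both same-day rows report 4 days in status (8 total).
def prev_distinct(row):
    earlier = logs[(logs["campaign.id"] == row["campaign.id"])
                   & (logs["days since"] < row["days since"])]["days since"]
    return earlier.max() if not earlier.empty else row["days since"]

logs["Days in status"] = logs["days since"] - logs.apply(prev_distinct, axis=1)
```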

1 ACCEPTED SOLUTION
Anonymous
Not applicable

Hi @snortham ,

 

Here I suggest adding an index column, numbered within each group of [campaign.id] and [change_date].

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMjE3sVTSUTIyMDLSNzTUNwKyHZNLMstSdX2AhFKsDlYlnnkKAUX5yanFxdhUmGBTYWqIpMLQBKs9mGowLTJDVmJGWIkFYZuMDAkaYwTytUd+Toquc05mal4JVjXG2AMPRY0FdnOMkdVYYjcHQw0x5mB6C6HESN8UTUUsAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [compaign.id = _t, change_date = _t, #"previous_value - key" = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"compaign.id", Int64.Type}, {"change_date", type date}, {"previous_value - key", type text}}),
    #"Renamed Columns" = Table.RenameColumns(#"Changed Type",{{"compaign.id", "campaign.id"}}),
    #"Grouped Rows" = Table.Group(#"Renamed Columns", {"campaign.id", "change_date"}, {{"Rows", each _, type table [campaign.id=nullable number, change_date=nullable date, #"previous_value - key"=nullable text]}}),
    #"Added Custom" = Table.AddColumn(#"Grouped Rows", "Custom", each Table.AddIndexColumn([Rows],"Index",1)),
    #"Expanded Custom" = Table.ExpandTableColumn(#"Added Custom", "Custom", {"previous_value - key", "Index"}, {"Custom.previous_value - key", "Custom.Index"}),
    #"Removed Columns" = Table.RemoveColumns(#"Expanded Custom",{"Rows"}),
    #"Renamed Columns1" = Table.RenameColumns(#"Removed Columns",{{"Custom.previous_value - key", "previous_value - key"}, {"Custom.Index", "Index"}}),
    #"Changed Type1" = Table.TransformColumnTypes(#"Renamed Columns1",{{"Index", Int64.Type}})
in
    #"Changed Type1"
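The group-then-index step above can also be sketched outside Power Query; here is a minimal pandas equivalent (column names assumed to match the M script):

```python
import pandas as pd

# Sample log: two rows share the same campaign id and change_date,
# which is exactly the case that double-counts in the original formula.
logs = pd.DataFrame({
    "campaign.id": [1, 1, 1, 2, 2],
    "change_date": pd.to_datetime(
        ["2022-01-01", "2022-01-05", "2022-01-05", "2022-02-01", "2022-02-10"]
    ),
    "previous_value - key": ["a", "b", "c", "a", "b"],
})

# 1-based running number within each (campaign.id, change_date) group,
# mirroring Table.AddIndexColumn([Rows], "Index", 1) after Table.Group.
logs["Index"] = logs.groupby(["campaign.id", "change_date"]).cumcount() + 1
```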

For reference: Create Row Number for Each Group in Power BI using Power Query

The new table looks like the one below.

RicoZhou_0-1671436640249.png

Then update your calculated column as below.

Days in status =
IF (
    campaigns_logs[Index] = 1,
    [days since]
        - LOOKUPVALUE (
            campaigns_logs[days since],
            campaigns_logs[campaign.id], [campaign.id],
            campaigns_logs[change_date].[Date],
                CALCULATE (
                    MAX ( campaigns_logs[change_date].[Date] ),
                    FILTER (
                        campaigns_logs,
                        campaigns_logs[campaign.id] = EARLIER ( campaigns_logs[campaign.id] )
                            && campaigns_logs[change_date] < EARLIER ( campaigns_logs[change_date] )
                    )
                )
        ),
    0
)
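The effect of the Index = 1 guard can be checked with a small hypothetical pandas sketch (column names and values assumed, not taken from the original workbook): only the first row per (campaign.id, change_date) group gets the gap from the previous distinct date, and same-day duplicates get 0.

```python
import pandas as pd

# Index = 1 marks the first row within each (campaign.id, change_date) group.
logs = pd.DataFrame({
    "campaign.id": [1, 1, 1, 1],
    "days since":  [0, 4, 4, 9],
    "Index":       [1, 1, 2, 1],
})

# Previous distinct "days since" per campaign: keep only Index = 1 rows,
# then shift within each campaign id.
firsts = logs[logs["Index"] == 1].copy()
firsts["prev"] = firsts.groupby("campaign.id")["days since"].shift()

logs = logs.merge(
    firsts[["campaign.id", "days since", "prev"]],
    on=["campaign.id", "days since"],
    how="left",
)

# Mirror IF ( Index = 1, [days since] - previous, 0 ).
logs["Days in status"] = (
    (logs["days since"] - logs["prev"].fillna(logs["days since"]))
    .where(logs["Index"] == 1, 0)
    .astype(int)
)
```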

 

Best Regards,
Rico Zhou

 

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.


3 REPLIES
Smshinde21
New Member

Hi All,

 

I'm looking for some support.

 

I want to calculate the number of days between the first and last date in the same column, where the same ID appears in multiple rows with different dates.

 

Can I get support with the DAX function logic? 🙏
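The first-to-last span per ID can be sketched with a hypothetical pandas sample (the equivalent DAX pattern would be a measure along the lines of DATEDIFF over MIN and MAX of the date column, grouped by ID):

```python
import pandas as pd

# Hypothetical log: several rows per ID, each with its own date.
df = pd.DataFrame({
    "ID":   [101, 101, 101, 202, 202],
    "Date": pd.to_datetime(
        ["2023-01-01", "2023-01-10", "2023-02-01", "2023-03-05", "2023-03-08"]
    ),
})

# Days between the first and last date for each ID.
span = (
    df.groupby("ID")["Date"]
      .agg(lambda s: (s.max() - s.min()).days)
      .rename("DaysBetween")
)
```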


danextian
Super User

Hi @snortham ,

 

You can use SUMX to virtually remove the duplicates within a measure.

=
SUMX (
    // create a virtual table of unique rows of campaign id, status and days in status
    SUMMARIZE (
        'Table',
        'Table'[Campaign ID],
        'Table'[Status],
        'Table'[Days in status]
    ),
    'Table'[Days in status]
)
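The SUMMARIZE-then-SUMX pattern above deduplicates before summing; a minimal pandas analogue (sample values are hypothetical) is drop_duplicates over the same three columns followed by a sum:

```python
import pandas as pd

# Duplicated rows: the same campaign, status and days-in-status appear twice.
t = pd.DataFrame({
    "Campaign ID":    [1, 1, 1, 2],
    "Status":         ["draft", "draft", "live", "draft"],
    "Days in status": [4, 4, 5, 3],
})

# SUMMARIZE over (Campaign ID, Status, Days in status) ~ drop_duplicates;
# SUMX over that virtual table ~ sum of the deduplicated rows.
total = (
    t.drop_duplicates(["Campaign ID", "Status", "Days in status"])
     ["Days in status"].sum()
)
```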

 

Dane Belarmino | Microsoft MVP | Proud to be a Super User!

Did I answer your question? Mark my post as a solution!


"Tell me and I’ll forget; show me and I may remember; involve me and I’ll understand."
Need Power BI consultation, get in touch with me on LinkedIn or hire me on UpWork.
Learn with me on YouTube @DAXJutsu or follow my page on Facebook @DAXJutsuPBI.
