
NikkiSavage
Helper II

Weight loss percentage with multiple customers and multiple weight dates

I have a set of data which contains multiple customers, each with multiple weight measurements on multiple dates. For example, customer 18 had 3 weight measurements taken in May. I need to work out the % weight change from the most recent date back to the closest date to 3 months prior. For instance, if the most recent weigh-in was 4 September, I want to look up the weight closest to 4 June (which in this case is 8 June) and calculate the movement: she was 55.5kg on 8 June and 56.6kg on 4 September, so the variance is 1.1kg, or 2%. Having multiple lines and multiple dates within a month makes this particularly hard. Any suggestions or ideas are most welcome. Picture of the data is below, as the forum won't let me paste a table into this message.
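
(For illustration only: the calculation itself is just the latest weight minus the weight closest to three months earlier, divided by that earlier weight. Below is a minimal Power Query sketch of the arithmetic using the figures quoted above; all names and values are made up for the example.)

let
    // Worked example using the figures quoted above; names are illustrative only
    FromWeightKG = 55.5,                       // weigh-in closest to 3 months before the latest date (8 June)
    ToWeightKG   = 56.6,                       // most recent weigh-in (4 September)
    ChangeKG     = ToWeightKG - FromWeightKG,  // 1.1 kg
    ChangePct    = ChangeKG / FromWeightKG     // ~0.02, i.e. roughly a 2% gain
in
    [Change = Number.Round(ChangeKG, 2), PercentChange = Number.Round(ChangePct, 3)]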

16 REPLIES
Anonymous
Not applicable

Hi @NikkiSavage ,


Could you tell me whether your problem has been solved?
If it has, kindly accept the helpful reply as the solution so that more people can benefit from it.
If you are still unsure about it, please provide me with more details about your problem.


Best Regards,
Stephen Tao

 

NikkiSavage
Helper II

NikkiSavage_0-1635141794192.png

 

Hi @NikkiSavage ,

 

Can you copy your example data and paste it into the 'Enter Data' section in Power Query, please? Then open the Advanced Editor for that query, copy all of the M code, and paste it into a code window (the '</>' icon) here.

 

This makes it really easy for people to get a sample of your data to work with, and therefore gets you a quicker solution.

 

Note that this will almost certainly require a DAX solution, even though you have posted in the Power Query forum. If you specifically need it done in PQ then please make that very clear, as it will be a significantly harder job to do.

 

Pete



Now accepting Kudos! If my post helped you, why not give it a thumbs-up?

Proud to be a Datanaut!




Hi. Code copied, I think, as you have noted above. Happy with a solution in any form, be it Power Query or DAX. Thanks for your help so far - much appreciated.

Hi @NikkiSavage ,

 

The code you copied still points to the Excel file on your C: drive which, obviously, no one on here has access to.

You need to copy your sample data table, then paste it into the ENTER DATA table. This creates a JSON source for the data that anyone here can import into their Power Query.

BA_Pete_0-1635227886925.png

 

When you've done it correctly, the first line(s) of the M code should look like this:

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WSlTSUTIxVYrViVZK...lots and lots of gobbledegook...

 

Pete

Hi Pete.

Thank you - hopefully the response I have just posted looks better? Appreciate your help. Thanks. Nikki

Yes, that's perfect, thanks.

 

So, which columns do you have available in your source data, and which ones do you need to be calculated dynamically?

I'm assuming it's probably [From Date], [FromWeightKG], and [ChangeKG] that need to be calculated, but please confirm.

 

Thanks,

 

Pete



Now accepting Kudos! If my post helped you, why not give it a thumbs-up?

Proud to be a Datanaut!




All of these columns are in my source data. What I need to do is pick up the last time each patient was weighed, find the nearest weigh-in date from 3 months prior to that point, and then calculate the percentage weight change. For John Smith, for example, I need it to pick up that the latest weigh-in was 4 September and that the nearest weigh-in 3 months prior to that was 8 June, so the weight difference is between the "from weight" on 8 June of 61 and the "to weight" on 4 September of 62.25, and the variance is 1.25kg, or a 2% gain. The actual data has hundreds of patients and there is no set weigh-in date or regularity. Hopefully this makes sense? Thank you 🙂

Hi @NikkiSavage ,

 

Yes, sorry, was being thick.

 

So, I've made this a bit simpler by adding in a column [dt3mth] in Power Query first:

 

Date.AddMonths([ToDate], -3)

 

 

This makes it significantly easier (IMO) to get the three-month date range to work within.
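
For reference, a self-contained sketch of that step using a tiny inline table with made-up values (in the real query the column would simply be added to the existing weights query, after its Changed Type step):

let
    // Tiny illustrative table; column names follow the sample data in this thread
    Source = #table(
        type table [CustomerCode = Int64.Type, ToDate = date, ToWeightKG = number],
        {{18, #date(2021, 9, 4), 56.6}, {18, #date(2021, 6, 8), 55.5}}
    ),
    // The added column: the date exactly three months before each ToDate
    #"Added dt3mth" = Table.AddColumn(Source, "dt3mth", each Date.AddMonths([ToDate], -3), type date)
in
    #"Added dt3mth"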

 

Once you have that new field, you can then add the following measures:

 

//This is just a display measure so you can see the date that the new FromWeight is being selected from

_fromDate3mth = 
VAR __cust =
SELECTEDVALUE(weights[CustomerCode])
VAR __toDate =
SELECTEDVALUE(weights[ToDate])
VAR __dt3mth =
SELECTEDVALUE(weights[dt3mth])
RETURN
CALCULATE(
    MIN(weights[ToDate]),
    ALL(weights),
    weights[CustomerCode] = __cust,
    KEEPFILTERS(weights[ToDate] >= __dt3mth),
    KEEPFILTERS(weights[ToDate] <= __toDate)
)

 

 

 

// This is the main measure that grabs the new FromWeight

_fromWeight3mth = 
VAR __cust =
SELECTEDVALUE(weights[CustomerCode])
VAR __toDate =
SELECTEDVALUE(weights[ToDate])
VAR __dt3mth =
SELECTEDVALUE(weights[dt3mth])
VAR __minDate =
CALCULATE(
    MIN(weights[ToDate]),
    ALL(weights),
    weights[CustomerCode] = __cust,
    KEEPFILTERS(weights[ToDate] >= __dt3mth),
    KEEPFILTERS(weights[ToDate] <= __toDate)
)
RETURN
CALCULATE(
    MAX(weights[ToWeightKG]),
    ALLEXCEPT(weights, weights[CustomerCode]),
    weights[ToDate] = __minDate
)

 

 

 

// Basic measure for new variance

_change3mth = SUM(weights[ToWeightKG]) - [_fromWeight3mth]

 

 

 

// Basic measure for new % variance

_change3mth% = DIVIDE([_change3mth], [_fromWeight3mth], 0)

 

 

These all give me the following output:

BA_Pete_0-1635250858731.png

 

Pete



Now accepting Kudos! If my post helped you, why not give it a thumbs-up?

Proud to be a Datanaut!




let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("rZbBbtswDIZfZci5EiRakqPj2sswYECBHosejM1A0rXO4KaH7elHSZRNCwZWCbuEMZB8Py2Sv/j4ePh8e3e4OXy9nCYMD6/n6wmjBvz4NjyPb9fzd/wKCrRQRwF+efBCGXwQWiqL0XWyCwEk2MPTTRW3R3QpogmX0ZVIx5HE14GlE7USZ4WCki2UjC+uKdQyDWeSgJIusJT0mVwJ7YQ2pQLlqaTJ7EooILdUwLq7pUABXcnUnEkC9PY6JcqLdD+8v3yK6Gl8C8zwwy+X+fznMrHO0SvSC3ApzR6Dhci0Cp8qqdg8upQQ6SyReszwSqoTWpUS+Po24bTN9EostpEtNQTILvKMXPGVXMPTJRElj5HXUdpGVmM7ni5pYMVs5FkJC76ea0oRxZghVCKBp0r8NKb44l0jVAsNpYJiSLMxvofTeH4ZwjC9z79OvwM0/OF+vryO0/Bj5F3q2CDE2cJjjeXy4cHrbQ98mFwOQ5BJE+aTYfumhEu31mnGIhcyvIFrebokkkzGJ5Px0II13LtIQ4d7D4EqDW/kN6A7fimYxRbDj49ergoNaOyvvtTBQ9YJ7bJCA1nzpLNMAvfSZv4KvjvN5zATt8P8c5wDt9v3nLUp7D+b4mPQjTOQgki5epMCVEOBuyMp0BXhk+lGdi1W81xhMfPgOb5f6eziHabxmgo3zNF0wuvtrHGK35NkZdFwXA6uHtvzUyAN6i8Emoxv4G7MIYqoDdQ0QbtSQaXb3MZZaMPiletLDTwDRReay7GFzE6BZMhrELjyq8GWL/V5tdW5bC7zG7ib9RZSwrApXMM52J2tee0y21y5vW2cPD3c7Dbzq8F7GzkmDNvKQQN5dy2nJnZ5J2ngBlcoRf7HIG9WfhIRi/H0S+me/gI=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [FacilityName = _t, FirstName = _t, LastName = _t, CustomerCode = _t, Wing = _t, FromDate = _t, ToDate = _t, ChangeKG = _t, FromWeightKG = _t, ToWeightKG = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"FacilityName", type text}, {"FirstName", type text}, {"LastName", type text}, {"CustomerCode", Int64.Type}, {"Wing", type text}, {"FromDate", type date}, {"ToDate", type date}, {"ChangeKG", type number}, {"FromWeightKG", type number}, {"ToWeightKG", type number}})
in
    #"Changed Type"

Not quite sure what format you want for the results, but you can produce the information you are looking for by using the Table.Group function in M code.

 

After grouping by CustomerCode (and FirstName, or perhaps FirstName and LastName):

  • Find the most recent date using List.Max on the ToDate column
  • Calculate the date three months prior using Date.AddMonths
  • Create a list of all the dates in the ToDate and FromDate columns, then find the "closest" one by taking the minimum absolute difference between the three-months-prior date and the existing dates
  • Then it is just a matter of matching up the appropriate weights and doing the calculations.

The code below provides an output like this given the data you supplied, and I think the variable names I used in the Table.Group aggregation function should be self-documenting as to what is being done at each step.

 

ronrsnfld_0-1635303295348.png

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("rZbBbtswDIZfZci5EiRakqPj2sswYECBHosejM1A0rXO4KaH7elHSZRNCwZWCbuEMZB8Py2Sv/j4ePh8e3e4OXy9nCYMD6/n6wmjBvz4NjyPb9fzd/wKCrRQRwF+efBCGXwQWiqL0XWyCwEk2MPTTRW3R3QpogmX0ZVIx5HE14GlE7USZ4WCki2UjC+uKdQyDWeSgJIusJT0mVwJ7YQ2pQLlqaTJ7EooILdUwLq7pUABXcnUnEkC9PY6JcqLdD+8v3yK6Gl8C8zwwy+X+fznMrHO0SvSC3ApzR6Dhci0Cp8qqdg8upQQ6SyReszwSqoTWpUS+Po24bTN9EostpEtNQTILvKMXPGVXMPTJRElj5HXUdpGVmM7ni5pYMVs5FkJC76ea0oRxZghVCKBp0r8NKb44l0jVAsNpYJiSLMxvofTeH4ZwjC9z79OvwM0/OF+vryO0/Bj5F3q2CDE2cJjjeXy4cHrbQ98mFwOQ5BJE+aTYfumhEu31mnGIhcyvIFrebokkkzGJ5Px0II13LtIQ4d7D4EqDW/kN6A7fimYxRbDj49ergoNaOyvvtTBQ9YJ7bJCA1nzpLNMAvfSZv4KvjvN5zATt8P8c5wDt9v3nLUp7D+b4mPQjTOQgki5epMCVEOBuyMp0BXhk+lGdi1W81xhMfPgOb5f6eziHabxmgo3zNF0wuvtrHGK35NkZdFwXA6uHtvzUyAN6i8Emoxv4G7MIYqoDdQ0QbtSQaXb3MZZaMPiletLDTwDRReay7GFzE6BZMhrELjyq8GWL/V5tdW5bC7zG7ib9RZSwrApXMM52J2tee0y21y5vW2cPD3c7Dbzq8F7GzkmDNvKQQN5dy2nJnZ5J2ngBlcoRf7HIG9WfhIRi/H0S+me/gI=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [FacilityName = _t, FirstName = _t, LastName = _t, CustomerCode = _t, Wing = _t, FromDate = _t, ToDate = _t, ChangeKG = _t, FromWeightKG = _t, ToWeightKG = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"FacilityName", type text}, {"FirstName", type text}, {"LastName", type text}, 
        {"CustomerCode", Int64.Type}, {"Wing", type text}, {"FromDate", type date}, {"ToDate", type date}, 
        {"ChangeKG", type number}, {"FromWeightKG", type number}, {"ToWeightKG", type number}}),
 
    #"Grouped Rows" = Table.Group(#"Changed Type", {"CustomerCode","FirstName"}, {
        {"Three Month Change", (t)=> 
            let 
                mostRecentDt = List.Max(t[ToDate]),
                threeBefore = Date.AddMonths(mostRecentDt,-3),
                allDates = List.Combine({t[ToDate],t[FromDate]}),
                diff = List.Transform(allDates, each Number.Abs(Duration.TotalDays(threeBefore-_))),
                positionMinDiff = List.PositionOf(diff, List.Min(diff)),
                closest = List.Range(allDates,positionMinDiff,1){0},
                mostRecentWt= List.Range(t[ToWeightKG] ,List.PositionOf(t[ToDate], mostRecentDt),1){0},
                allWeights = List.Combine({t[ToWeightKG],t[FromWeightKG]}),
                threeMonthWt = List.Range(allWeights,positionMinDiff,1){0},
                threeMonthChg = threeMonthWt-mostRecentWt
            in 
                Text.Combine({Text.From(threeMonthChg), Text.From(threeMonthChg/threeMonthWt)},";"), type text}
        }),
    #"Split Column by Delimiter" = Table.SplitColumn(#"Grouped Rows", "Three Month Change", 
        Splitter.SplitTextByDelimiter(";", QuoteStyle.Csv), {"Three Month Wt ChangeKG", "Three Month % Change"}),
    #"Changed Type1" = Table.TransformColumnTypes(#"Split Column by Delimiter",{
        {"Three Month Wt ChangeKG", type number}, {"Three Month % Change", Percentage.Type}})
in
    #"Changed Type1"

 

Thanks Pete. It looks like it works for you, so I am hopeful it works for me. However, I am copying and pasting the formula in and not having any luck. For example, the formula for [_fromDate3mth] just returns 1 January 2021 for every single line. I have updated it to reference my sheet but am still having trouble.

NikkiSavage_0-1635297134813.png

 

Hi @NikkiSavage ,

 

Everything that starts with '_' is a measure, not a calculated column.

Calculated columns in DAX are not best practice.

The only column I added was the Date.AddMonths one in Power Query.

 

Pete



Now accepting Kudos! If my post helped you, why not give it a thumbs-up?

Proud to be a Datanaut!





let
    Source = Excel.Workbook(File.Contents("C:\Users\61431\Desktop\Example Weight Loss.xlsx"), null, true),
    Sheet1_Sheet = Source{[Item="Sheet1",Kind="Sheet"]}[Data],
    #"Promoted Headers" = Table.PromoteHeaders(Sheet1_Sheet, [PromoteAllScalars=true]),
    #"Changed Type" = Table.TransformColumnTypes(#"Promoted Headers",{{"FacilityName", type text}, {"FirstName", type text}, {"LastName", type text}, {"CustomerCode", Int64.Type}, {"Wing", type text}, {"FromDate", type date}, {"ToDate", type date}, {"ChangeKG", type number}, {"FromWeightKG", type number}, {"ToWeightKG", type number}, {"Column11", type any}, {"Column12", type any}, {"Column13", type any}, {"Column14", type any}})
in
    #"Changed Type"
