Hi Everyone,
This is my first community post, so I apologise in advance if the structure of the question is unclear (I have read through the guidelines on asking questions and will try to stick to them as much as possible!)
I'm currently trying to compute the relative weight of a security for a portfolio, grouped by quarter and fund. The base table (attached) looks something like this:
fund | quarter | security | dollar_position |
A | 31/12/2021 | VVV | 125 |
A | 31/12/2021 | XXX | 75 |
A | 30/09/2021 | VVV | 100 |
A | 30/09/2021 | XXX | 100 |
B | 31/12/2021 | YYY | 60 |
B | 31/12/2021 | ZZZ | 20 |
B | 30/09/2021 | YYY | 80 |
B | 30/09/2021 | ZZZ | 20 |
I would like to get a new column, weight, that indicates the weight of a security for a given fund in a given quarter:
fund | quarter | security | dollar_position | weight |
A | 31/12/2021 | VVV | 125 | 0.625 |
A | 31/12/2021 | XXX | 75 | 0.375 |
A | 30/09/2021 | VVV | 100 | 0.500 |
A | 30/09/2021 | XXX | 100 | 0.500 |
B | 31/12/2021 | YYY | 60 | 0.750 |
B | 31/12/2021 | ZZZ | 20 | 0.250 |
B | 30/09/2021 | YYY | 80 | 0.800 |
B | 30/09/2021 | ZZZ | 20 | 0.200 |
Just to make sure we are on the same page, the first weight value (highlighted in magenta in the attached file) is obtained as follows:
weight = 125 / (125 + 75) = 0.625
I have tried using Table.Group() and some trials in DAX, but with no success, so my current method involves a Python script:
import pandas as pd

# Divide each position by the total of its (fund, quarter) group
dataset['weight'] = dataset.groupby(['fund', 'quarter_end_date'])['dollar_position'].transform(lambda x: x / x.sum())
# Convert the date column back to text before returning the frame to Power BI
dataset['quarter_end_date'] = dataset['quarter_end_date'].astype(str)
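For reference, here is a self-contained version of the script above with the sample table from the question inlined (the DataFrame literal is my own re-typing of that table, not the attached file), which reproduces the expected weight column:

```python
import pandas as pd

# Sample data re-typed from the table in the question
dataset = pd.DataFrame({
    'fund': ['A', 'A', 'A', 'A', 'B', 'B', 'B', 'B'],
    'quarter_end_date': ['31/12/2021', '31/12/2021', '30/09/2021', '30/09/2021',
                         '31/12/2021', '31/12/2021', '30/09/2021', '30/09/2021'],
    'security': ['VVV', 'XXX', 'VVV', 'XXX', 'YYY', 'ZZZ', 'YYY', 'ZZZ'],
    'dollar_position': [125, 75, 100, 100, 60, 20, 80, 20],
})

# Each position divided by the total of its (fund, quarter) group
dataset['weight'] = (
    dataset.groupby(['fund', 'quarter_end_date'])['dollar_position']
           .transform(lambda x: x / x.sum())
)

print(dataset['weight'].tolist())
# → [0.625, 0.375, 0.5, 0.5, 0.75, 0.25, 0.8, 0.2]
```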
But I feel like this is exactly the kind of data analysis Power BI was invented for in the first place, and I would prefer to move away from Python scripts for readability, since other users may have little or no experience in Python or R.
LINK TO FILES FOR REPRODUCING:
https://drive.google.com/drive/folders/1QxzAUMxfBc3J7ZGCrO-sD1hGoQ_NrYHP?usp=sharing
Let me know if there are any issues with the attached dummy report and sample data files. Unfortunately I could not attach the files directly (new-user restrictions, I suppose?)
I've also tried linking the Excel file to the Power BI dashboard using a relative path, but I haven't managed it (is that even possible?), so you will unfortunately have to relink it manually =).
Thanks to anyone who took the time to even read the post !
Hi @Anonymous ,
Paste this into a new blank query using the Advanced Editor:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WclTSUTI21Dc00jcyMDIEcsLCwoCkoZGpUqwOFumIiAggaY4ka6BvYImm2cAAuzREM0zaCd3syMhIIGmGQzYqKgpIGiHJIhsN0WuBQxahNxYA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [fund = _t, quarter = _t, security = _t, dollar_position = _t]),
chgTypes = Table.TransformColumnTypes(Source,{{"fund", type text}, {"quarter", type date}, {"security", type text}, {"dollar_position", Int64.Type}}),
groupRows = Table.Group(chgTypes, {"fund", "quarter"}, {{"data", each _, type table [fund=nullable text, quarter=nullable date, security=nullable text, dollar_position=nullable number]}, {"dollar_posn_sum", each List.Sum([dollar_position]), type nullable number}}),
expandData = Table.ExpandTableColumn(groupRows, "data", {"security", "dollar_position"}),
addWeight = Table.AddColumn(expandData, "weight", each [dollar_position] / [dollar_posn_sum])
in
addWeight
This gives me the following output:
Pete
Proud to be a Datanaut!
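For anyone who wants to sanity-check the logic of the query above outside Power Query, the same pattern (group by fund and quarter, sum the positions, re-attach the group total to each row, then divide) can be sketched in pandas. The data below is my own re-typing of one quarter from the question, not the table embedded in the query:

```python
import pandas as pd

# One quarter of the sample data from the question
df = pd.DataFrame({
    'fund': ['A', 'A', 'B', 'B'],
    'quarter': ['31/12/2021', '31/12/2021', '31/12/2021', '31/12/2021'],
    'security': ['VVV', 'XXX', 'YYY', 'ZZZ'],
    'dollar_position': [125, 75, 60, 20],
})

# ~ Table.Group: sum dollar_position per (fund, quarter)
totals = (df.groupby(['fund', 'quarter'], as_index=False)['dollar_position']
            .sum()
            .rename(columns={'dollar_position': 'dollar_posn_sum'}))

# ~ Table.ExpandTableColumn: bring the group total back onto each row
df = df.merge(totals, on=['fund', 'quarter'])

# ~ Table.AddColumn: per-row weight
df['weight'] = df['dollar_position'] / df['dollar_posn_sum']

print(df['weight'].tolist())
# → [0.625, 0.375, 0.75, 0.25]
```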
Thank you very much @BA_Pete ! This worked like a charm !
I'm just going to post my adapted Advanced Editor query for anyone struggling with the Source step (it took me longer than expected, though I am a complete newbie to Power BI).
Perhaps it's worth including in your answer?
Kindest,
JC
let
Source = Excel.Workbook(
File.Contents("C:\Users\..\Downloads\power_bi_question_sample_data.xlsx"),
null,
true
){[Item = "Sheet2", Kind = "Sheet"]}[Data],
#"Promote Header" = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
#"Transform Column" = Table.TransformColumnTypes(
#"Promote Header",
{
{"fund", type text},
{"quarter_end_date", type date},
{"security", type text},
{"dollar_position", Int64.Type}
}
),
#"Group Rows" = Table.Group(
#"Transform Column",
{"fund", "quarter_end_date"},
{
{
"data",
each _,
type table [
fund = nullable text,
quarter_end_date = nullable date,
security = nullable text,
dollar_position = nullable number
]
},
{"dollar_posn_sum", each List.Sum([dollar_position]), type nullable number}
}
),
#"Expand Data" = Table.ExpandTableColumn(#"Group Rows", "data", {"security", "dollar_position"}),
#"Add Column" = Table.AddColumn(
#"Expand Data",
"weight",
each [dollar_position] / [dollar_posn_sum]
),
#"Removed Columns" = Table.RemoveColumns(#"Add Column", {"dollar_posn_sum"})
in
#"Removed Columns"