Hi everyone,
My aim is very simple, yet somehow I couldn't find the resources to do this task, so I was hoping you could help me.
My goal is to do a simple Group By over all the columns in my worksheet that contain data. I am aware that with the "Advanced" option you can add multiple "New column name" entries, but the issue is that I have more than 100 columns in my data. Is there a simple way to apply the "Sum" operation to all remaining columns in my dataset (excluding the column used for grouping, of course)? Manually clicking "Add Aggregation" a hundred times is really not an idea that pleases me, I must say 😄
Thanks a lot for your help.
Solved! Go to Solution.
Hi @AnonymeC
let
    Source = YourSource,
    UnPivot = Table.UnpivotOtherColumns(Source, {"Code"}, "Attribute", "Value"),
    Pivot = Table.Pivot(UnPivot, List.Distinct(UnPivot[Attribute]), "Attribute", "Value", List.Sum)
in
    Pivot
or
let
    Source = YourSource,
    Group = Table.Group(Source, {"Code"},
        List.Transform(
            List.Skip(Table.ColumnNames(Source)),
            each {_, Expression.Evaluate("each List.Sum([" & _ & "])", #shared), type number}))
in
    Group
Stéphane
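For readers who would rather avoid Expression.Evaluate, the same dynamic aggregation list can be built with an ordinary closure over the column name. This is a minimal sketch assuming the grouping column is named "Code" and the query is referenced as YourSource (both placeholders from the answer above):

```m
let
    Source = YourSource,
    // Every column except the grouping column "Code" gets a Sum aggregation
    Aggregations = List.Transform(
        List.Skip(Table.ColumnNames(Source)),
        (name) => {name, (t) => List.Sum(Table.Column(t, name)), type number}),
    Grouped = Table.Group(Source, {"Code"}, Aggregations)
in
    Grouped
```

Capturing `name` in an explicit `(name) => ...` lambda sidesteps the string-building and keeps the query foldable and easier to debug.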
Thank you all for your feedback. The code shared by Stéphane worked perfectly, so I marked it as the solution, but I appreciate the other answers and will give them a try.
If needed I can generate a file.
Hi, please see below an example of data
This is my input:
Code | Jul | Aug | Sep | Q1 | Oct | Nov | Dec | Q2 |
AAAA | 2 | 5 | 7 | 14 | 3 | 10 | 2 | 15 |
AAAA | 0 | 2 | 1 | 3 | 6 | 2 | 1 | 9 |
BBBB | 5 | 5 | 5 | 15 | 5 | 5 | 5 | 15 |
BBBB | 5 | 5 | 5 | 15 | 5 | 5 | 5 | 15 |
CCCC | 3 | 5 | 1 | 9 | 10 | 2 | 2 | 14 |
Expected Output:
Code | Jul | Aug | Sep | Q1 | Oct | Nov | Dec | Q2 |
AAAA | 2 | 7 | 8 | 17 | 9 | 12 | 3 | 24 |
BBBB | 10 | 10 | 10 | 30 | 10 | 10 | 10 | 30 |
CCCC | 3 | 5 | 1 | 9 | 10 | 2 | 2 | 14 |
Except that I have many more columns than the 8 shown here, so manually adding each aggregation is not a workable solution.
Try this:
let
    //Change Source to your own data source
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WcgQCJR0lIyA2BWJzIDY0ARLGIIYBVMbQVClWB64WLgpVZobEtwQrdAICqIEwbGiKKUCiUmcggFppCrcOyZVGELfHxgIA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Code = _t, Jul = _t, Aug = _t, Sep = _t, Q1 = _t, Oct = _t, Nov = _t, Dec = _t, Q2 = _t]),
    //Set data types
    #"Changed Type" = Table.TransformColumnTypes(Source, List.Transform(List.Skip(Table.ColumnNames(Source)), each {_, Int64.Type})),
    //Group by Code; build a record of column sums for each group, then expand it
    #"Group by Code" = Table.Group(#"Changed Type", "Code", {
        {"Sums", (t) =>
            [a = List.Skip(Table.ColumnNames(t)),
             b = List.Transform(a, each List.Sum(Table.Column(t, _))),
             c = Record.FromList(b, a)][c]}}),
    #"Expanded Sums" = Table.ExpandRecordColumn(#"Group by Code", "Sums", List.Skip(Table.ColumnNames(Source))),
    #"Changed Type2" = Table.TransformColumnTypes(#"Expanded Sums", List.Transform(List.Skip(Table.ColumnNames(#"Expanded Sums")), each {_, Int64.Type}))
in
    #"Changed Type2"
Please provide a representative data sample (preferably as text which can be copy/pasted, or a link to a file) along with expected data output from that sample.
It is not clear to me exactly what you want to do, and the above would be helpful. In general:
Again, depending on your expected output, you might apply that list to the List.Accumulate, List.Generate, or a recursive function.
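As an illustration of the List.Accumulate route mentioned above, here is a hypothetical sketch that folds the non-grouping column names into a Table.Group aggregation spec (YourSource and the "Code" grouping column are assumptions, mirroring the sample data earlier in the thread):

```m
let
    Source = YourSource,
    // Fold each remaining column name into a {name, aggregator, type} triple
    Aggregations = List.Accumulate(
        List.Skip(Table.ColumnNames(Source)),
        {},
        (state, name) => state & {{name, (t) => List.Sum(Table.Column(t, name)), type number}}),
    Grouped = Table.Group(Source, {"Code"}, Aggregations)
in
    Grouped
```

List.Accumulate is overkill when a plain List.Transform would do, but the same fold pattern generalizes to cases where each step depends on the result so far.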