

PLeducq
New Member

Average of a Variable Number of Columns

Dear Power Query gods,


Here is my case (a bit different from the classic Sales tables!): my analysis requires extracting a raw .csv file containing a variable (but known in advance) number of columns. Those columns correspond to the numerical results of each frame of an experiment that lasts for a fixed duration D of 10 seconds. The number of columns in the file depends on the frequency at which the data is acquired. For example, if the data is acquired at 50 frames per second, the table has 500 result columns + the first column ("Depth (mm)" in the example below) = 501 columns. If the data is acquired at 100 frames per second, the table arrives with 1001 columns.

 

Say, for our example, that the experiment is acquired at 50 frames per second:

 

Depth (mm) | T=0.02 second(s) | T=0.04 second(s) | T=0.06 second(s) | ... | T=10 second(s)
-----------|------------------|------------------|------------------|-----|---------------
1          | 2.00             | 0.00             | 3.60             | ... | 5.00
2          | 50.00            | 60.00            | 10.00            | ... | 9999.65
...        | ...              | ...              | ...              | ... | ...


First, I need to create one new column that calculates, for each row, the average value of every result column.
In other words, I need a dynamic equivalent of:
= Table.AddColumn(#"Raw Data", "Total Average", each List.Average({[#"T=0.02 second(s)"], [#"T=0.04 second(s)"], ..., [#"T=10.00 second(s)"]}))

that covers all my result columns, however many there are in the raw data.

 

Next, similarly, I need to create 10 new columns that calculate the average results over every time window of Ts = D/10 = 1 second. That is, I want a first column that calculates, for each row, the average of columns "T=0.02 second(s)" to "T=1.00 second(s)", then another column that averages columns "T=1.02 second(s)" to "T=2.00 second(s)", and so on until the 10th new column, which averages columns "T=9.02 second(s)" to "T=10.00 second(s)".

I've been struggling with this issue for days and couldn't find the answer on the forum yet. I tried transposing my table, but it didn't make things easier. I suppose I need some sort of "For" or "While" loop equivalent, but I don't know how to address my columns dynamically.
Unless someone has an easier solution in mind...?
Thank you so much for your help!

6 REPLIES
edhans
Super User

I recommend you unpivot the variable columns and do the averaging in DAX. Your final table would look like this:

(screenshot: the unpivoted table with Depth (mm), Attribute, and Value columns)

  1. Select the Depth column
  2. Right-click and choose Unpivot Other Columns. It doesn't matter if there are 5 or 500 columns. It just works.
  3. Convert Value to a number if it isn't already, and rename Attribute and Value to something more meaningful.

Now use DAX to do averages, which can be computed over depth, the T= column, or a combination of those. You can easily count too, quickly seeing that 1 mm has 4 timestamp captures.
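For example, a minimal measure along these lines would do it (hypothetical names: it assumes the unpivoted table is loaded as Results, with the Value column renamed to Result):

// Hypothetical DAX measure over the unpivoted table
Average Result = AVERAGE ( Results[Result] )

Sliced by depth, by the T= column, or by both, it returns the average at that granularity.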

 

Full M code:

let
    // Sample data matching the table in the original post
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMlTSUTICYgMgNtYzA5KmSrE60WAxU5CgGYgwBBGWQKBnBpSOBQA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Depth (mm)" = _t, #"T=0.02 second(s)" = _t, #"T=0.04 second(s)" = _t, #"T=0.06 second(s)" = _t, #"T=10 second(s)" = _t]),
    // Unpivot everything except Depth so the number of result columns no longer matters
    #"Unpivoted Other Columns" = Table.UnpivotOtherColumns(Source, {"Depth (mm)"}, "Attribute", "Value"),
    // Type the Value column as a number
    #"Changed Type" = Table.TransformColumnTypes(#"Unpivoted Other Columns",{{"Value", Currency.Type}})
in
    #"Changed Type"

 

How to use M code provided in a blank query:
1) In Power Query, select New Source, then Blank Query
2) On the Home ribbon, select "Advanced Editor" button
3) Remove everything you see, then paste the M code I've given you in that box.
4) Press Done
5) See this article if you need help using this M code in your model.
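If you ever need to keep the wide shape instead of unpivoting, the row average can also be computed without naming any column, by taking every field of the current row except the first. A sketch (untested; #"Typed Source" is a stand-in for the wide table once its result columns have been converted to numbers):

// Untested sketch: average every value in the current row except the
// first field ("Depth (mm)"), however many result columns there are.
#"Added Total Average" = Table.AddColumn(
    #"Typed Source",
    "Total Average",
    each List.Average(List.Skip(Record.FieldValues(_), 1)),
    type nullable number
)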




Related to this, I did have one other thought. If you must have the average column in Power Query (your destination is an Excel worksheet, for example, not the data model), then the same concept applies:

  1. Unpivot as above
  2. Do a group by, calculating the average
  3. Merge the grouped table with the source step. 

You get this:

 

(screenshot: the source table with the new Average Time column)

 

let
    // Sample data matching the table in the original post
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMlTSUTICYgMgNtYzA5KmSrE60WAxU5CgGYgwBBGWQKBnBpSOBQA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Depth (mm)" = _t, #"T=0.02 second(s)" = _t, #"T=0.04 second(s)" = _t, #"T=0.06 second(s)" = _t, #"T=10 second(s)" = _t]),
    // Unpivot everything except Depth
    #"Unpivoted Other Columns" = Table.UnpivotOtherColumns(Source, {"Depth (mm)"}, "Attribute", "Value"),
    #"Changed Type" = Table.TransformColumnTypes(#"Unpivoted Other Columns",{{"Value", Currency.Type}}),
    // Average per depth
    #"Grouped Rows" = Table.Group(#"Changed Type", {"Depth (mm)"}, {{"Average Time", each List.Average([Value]), type nullable number}}),
    // Join the averages back onto the original wide table
    #"Merged Queries" = Table.NestedJoin(Source, {"Depth (mm)"}, #"Grouped Rows", {"Depth (mm)"}, "Custom1", JoinKind.LeftOuter),
    #"Expanded Custom1" = Table.ExpandTableColumn(#"Merged Queries", "Custom1", {"Average Time"}, {"Average Time"})
in
    #"Expanded Custom1"

 




Thank you so much edhans, it eventually worked!!
I did indeed have to do it in Power Query, as I have more transformations to apply to the table afterwards, and your methodology solved my issue. It required a little additional trick to do the average for each second, but that was quite easily solved.
Below is an extract of my code related to this issue, for further reference:

 

let
  // Combine all .csv files from the source folder
  Source = Folder.Files(
    "C:\Users\PLeducq\....."
  ), 
  #"Filtered Rows" = Table.SelectRows(Source, each [Extension] = ".csv"), 
  #"Removed Other Columns" = Table.SelectColumns(#"Filtered Rows", {"Content"}), 
  #"Filtered Hidden Files1" = Table.SelectRows(
    #"Removed Other Columns", 
    each [Attributes]?[Hidden]? <> true
  ), 
  #"Invoke Custom Function1" = Table.AddColumn(
    #"Filtered Hidden Files1", 
    "Transform File (4)", 
    each #"Transform File (4)"([Content])
  ), 
  #"Removed Other Columns1" = Table.SelectColumns(
    #"Invoke Custom Function1", 
    {"Transform File (4)"}
  ), 
  #"Expanded Table Column1" = Table.ExpandTableColumn(
    #"Removed Other Columns1", 
    "Transform File (4)", 
    Table.ColumnNames(#"Transform File (4)"(#"Sample File (4)"))
  ), 
   #"Depth (mm)" = Table.AddIndexColumn(#"Renamed Columns", "Depth (mm)", 1, 1, Int64.Type), 
   #"Unpivoted Other Columns" = Table.UnpivotOtherColumns(
    #"Expanded Table Column1", 
    {"Depth (mm)"}, 
    "Attribute", 
    "Value"
  ), 
    #"Changed Type" = Table.TransformColumnTypes(
    #"Replaced Value", 
    {{"Attribute", Int64.Type}, {"Value", Number.Type}}
  ), 
  #"Renamed Columns1" = Table.RenameColumns(#"Changed Type", {{"Attribute", "Time Stamp"}}), 
  #"Added Conditional Column" = Table.AddColumn(
    #"Renamed Columns1", 
    "Time (s)", 
    each 
      if [Time Stamp]
        <= Table.ColumnCount(#"Depth (mm)")
        / List.Max(#"PBI Software Config"[#"Acqusition Time (s)"])
      then
        List.Max(#"PBI Software Config"[#"Time Resolution (s)"])
      else if [Time Stamp]
        <= 2
        * Table.ColumnCount(#"Depth (mm)")
        / List.Max(#"PBI Software Config"[#"Acqusition Time (s)"])
      then
        2 * List.Max(#"PBI Software Config"[#"Time Resolution (s)"])
      else if [Time Stamp]
        <= 3
        * Table.ColumnCount(#"Depth (mm)")
        / List.Max(#"PBI Software Config"[#"Acqusition Time (s)"])
      then
        3 * List.Max(#"PBI Software Config"[#"Time Resolution (s)"])
      else if [Time Stamp]
        <= 4
        * Table.ColumnCount(#"Depth (mm)")
        / List.Max(#"PBI Software Config"[#"Acqusition Time (s)"])
      then
        4 * List.Max(#"PBI Software Config"[#"Time Resolution (s)"])
      else if [Time Stamp]
        <= 5
        * Table.ColumnCount(#"Depth (mm)")
        / List.Max(#"PBI Software Config"[#"Acqusition Time (s)"])
      then
        5 * List.Max(#"PBI Software Config"[#"Time Resolution (s)"])
      else if [Time Stamp]
        <= 6
        * Table.ColumnCount(#"Depth (mm)")
        / List.Max(#"PBI Software Config"[#"Acqusition Time (s)"])
      then
        6 * List.Max(#"PBI Software Config"[#"Time Resolution (s)"])
      else if [Time Stamp]
        <= 7
        * Table.ColumnCount(#"Depth (mm)")
        / List.Max(#"PBI Software Config"[#"Acqusition Time (s)"])
      then
        7 * List.Max(#"PBI Software Config"[#"Time Resolution (s)"])
      else if [Time Stamp]
        <= 8
        * Table.ColumnCount(#"Depth (mm)")
        / List.Max(#"PBI Software Config"[#"Acqusition Time (s)"])
      then
        8 * List.Max(#"PBI Software Config"[#"Time Resolution (s)"])
      else if [Time Stamp]
        <= 9
        * Table.ColumnCount(#"Depth (mm)")
        / List.Max(#"PBI Software Config"[#"Acqusition Time (s)"])
      then
        9 * List.Max(#"PBI Software Config"[#"Time Resolution (s)"])
      else
        10 * List.Max(#"PBI Software Config"[#"Time Resolution (s)"])
  ), 
  #"Changed Type4" = Table.TransformColumnTypes(
    #"Added Conditional Column", 
    {{"Time (s)", type duration}}
  ), 
  #"Reordered Columns1" = Table.ReorderColumns(
    #"Changed Type4", 
    {"Depth (mm)", "Time Stamp", "Time (s)", "Value"}
  ), 
  #"Average Per Time Resolution" = Table.Group(
    #"Reordered Columns1", 
    {"Depth (mm)", "Time (s)"}, 
    {{"Average Profile", each List.Average([Value]), type nullable number}}
  )
in
  #"Average Per Time Resolution"

Thank you again,

Paul 

Great @PLeducq - glad I was able to assist in moving the project forward a bit. Looks like you have a good handle on it now, but post a new question if you get stuck or just want advice on a direction to go.




Thank you again @edhans, you definitely helped move it forward! Given how fast and efficient the forum is, I won't hesitate to come back if I'm stuck again!

Cheers,

Paul

Any comments or questions on this, @PLeducq?



