Hello,
I have this data below where I want to convert rows into columns and columns into rows. Please see the expected result below.
Row Data:
Source.Name | Type | Column2 | Column4 | Column5 |
Nov 04,2024- Vendor.xlsx | Airport/Code | YSJ | YFC | |
Nov 04,2024- Vendor.xlsx | Aircraft Type | Commercial | Commercial | |
Nov 04,2024- Vendor.xlsx | Agreed Differential | 0.0797 | 0.0603 | |
Nov 04,2024- Vendor.xlsx | FUEL PRICE - LT/CA$ | 0.8521 | 0.8327 | |
Nov 04,2024- Vendor.xlsx | Airport/Code | YHZ | YQM | |
Nov 04,2024- Vendor.xlsx | Aircraft Type | Commerical (ITS) | Commercial | |
Nov 04,2024- Vendor.xlsx | Agreed Differential | 0.0641 | 0.0592 | |
Nov 04,2024- Vendor.xlsx | FUEL PRICE - LT/CA$ | 0.8365 | 0.8316 | |
Nov 04,2024- Vendor.xlsx | Airport/Code | YHZ-2 | YYG | BOS |
Nov 04,2024- Vendor.xlsx | Aircraft Type | Commercial | Commercial | |
Nov 04,2024- Vendor.xlsx | Agreed Differential | 0.0839 | 0.0971 | 0.08 |
Nov 04,2024- Vendor.xlsx | FUEL PRICE - LT/CA$ | 0.8563 | 0.8695 | 2.1813 |
Nov 11,2024- Vendor.xlsx | Airport/Code | YSJ | YFC | |
Nov 11,2024- Vendor.xlsx | Aircraft Type | Commercial | Commercial | |
Nov 11,2024- Vendor.xlsx | Agreed Differential | 0.0797 | 0.0603 | |
Nov 11,2024- Vendor.xlsx | FUEL PRICE - LT/CA$ | 0.8808 | 0.8614 | |
Nov 11,2024- Vendor.xlsx | Airport/Code | YHZ | YQM | |
Nov 11,2024- Vendor.xlsx | Aircraft Type | Commerical (ITS) | Commercial | |
Nov 11,2024- Vendor.xlsx | Agreed Differential | 0.0641 | 0.0592 | |
Nov 11,2024- Vendor.xlsx | FUEL PRICE - LT/CA$ | 0.8652 | 0.8603 | |
Nov 11,2024- Vendor.xlsx | Airport/Code | YHZ-2 | YYG | BOS |
Nov 11,2024- Vendor.xlsx | Aircraft Type | Commercial | Commercial | |
Nov 11,2024- Vendor.xlsx | Agreed Differential | 0.0839 | 0.0971 | 0.08 |
Nov 11,2024- Vendor.xlsx | FUEL PRICE - LT/CA$ | 0.885 | 0.8982 | 2.263 |
Expected Result
Source.Name | Station | Agreed Differential | FUEL PRICE - LT/CA$ |
Nov 04,2024- Vendor.xlsx | YSJ | 0.0797 | 0.8521 |
Nov 04,2024- Vendor.xlsx | YFC | 0.0603 | 0.8327 |
Nov 04,2024- Vendor.xlsx | YHZ | 0.0641 | 0.8365 |
Nov 04,2024- Vendor.xlsx | YQM | 0.0592 | 0.8316 |
Nov 04,2024- Vendor.xlsx | YHZ-2 | 0.0839 | 0.8563 |
Nov 04,2024- Vendor.xlsx | YYG | 0.0971 | 0.8695 |
Nov 04,2024- Vendor.xlsx | BOS | 0.08 | 2.1813 |
Nov 11,2024- Vendor.xlsx | YSJ | 0.0797 | 0.8808 |
Nov 11,2024- Vendor.xlsx | YFC | 0.0603 | 0.8614 |
Nov 11,2024- Vendor.xlsx | YHZ | 0.0641 | 0.8652 |
Nov 11,2024- Vendor.xlsx | YQM | 0.0592 | 0.8603 |
Nov 11,2024- Vendor.xlsx | YHZ-2 | 0.0839 | 0.885 |
Nov 11,2024- Vendor.xlsx | YYG | 0.0971 | 0.8982 |
Nov 11,2024- Vendor.xlsx | BOS | 0.08 | 2.263 |
Try this:
Assume the source data in your Power Query is:
This approach (combining Column2, Column4, and Column5 into one column) is not optimized, but it works!
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("zZRdS4RAGIX/yiBdFLjufOg4Xm62WxvbpxbY4oXoGIK7LpPE9u/THQkKzBkL6ublzMU5+D68x/XauK5eAbRNDLE9AY98m1XC2pcve8M0ZoXYVaKe+lXGm2cUXLZz4TcTGLE55E1FktcgfNu1Zr/abLhIi6T8+hiOehacZ+CsyHMu+LaWNmhB13OloJCoBC0e5itwe7/052ACVuHUnx0d/MzBSAqCXcXlPoG5eGrn3dV4MEWalOB4GQYnv4eH2nIr6Hj4R3gIdTqB6Dg8k/YDoui8mac3wX+4HUY8KTy3o8TGnw8lUlCvBYUtxBD5SENofLn6vdqA+qK0y9UT1EuHNWQlHWQrLjdcLi0wKuXSxdNbLl081MGdUOOsXK4/vJ3vyqV9Pt2/x2P40C3clC2O3wE=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Source.Name = _t, Type = _t, Column2 = _t, Column4 = _t, Column5 = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Source.Name", type text}, {"Type", type text}, {"Column2", type text}, {"Column4", type text}, {"Column5", type text}}),
// Merge the three value columns (Column2, Column4, Column5) into a single DataValue column; there may be a faster way, too.
#"Dataset1" = Table.RemoveColumns(#"Changed Type",{"Column4", "Column5"}),
#"Renamed Columns" = Table.RenameColumns(Dataset1,{{"Column2", "DataValue"}}),
#"Dataset2" = Table.RemoveColumns(#"Changed Type",{"Column2", "Column5"}),
#"Renamed Columns1" = Table.RenameColumns(Dataset2,{{"Column4", "DataValue"}}),
#"Dataset3" = Table.RemoveColumns(#"Changed Type",{"Column2", "Column4"}),
#"Renamed Columns2" = Table.RenameColumns(Dataset3,{{"Column5", "DataValue"}}),
#"MergeAllData" = Table.Combine({#"Renamed Columns", #"Renamed Columns1", #"Renamed Columns2"}),
// Transform: index the rows, group them into sets of four, pivot by Type, and clean up for the final output
#"Added Index" = Table.AddIndexColumn(#"MergeAllData", "Index", 1, 1, Int64.Type),
#"Added Custom" = Table.AddColumn(#"Added Index", "Custom", each Number.RoundDown(([Index]-1)/4)+1),
#"Added Custom1" = Table.AddColumn(#"Added Custom", "Custom.1", each Number.Mod([Index],4)),
#"Removed Columns" = Table.RemoveColumns(#"Added Custom1",{"Custom.1"}),
#"Removed Columns1" = Table.RemoveColumns(#"Removed Columns",{"Index"}),
#"Pivoted Column" = Table.Pivot(#"Removed Columns1", List.Distinct(#"Removed Columns1"[Type]), "Type", "DataValue"),
#"Replaced Value" = Table.ReplaceValue(#"Pivoted Column"," ",null,Replacer.ReplaceValue,{"Airport/Code"}),
#"Filtered Rows" = Table.SelectRows(#"Replaced Value", each [#"Airport/Code"] <> null and [#"Airport/Code"] <> "")
in
#"Filtered Rows"
Note that I removed the Index and Custom.1 columns in two separate steps; you can combine them. I kept them separate for clarity when stepping through the applied steps in the Power Query editor.
TODO: Decide which columns to keep (for example, the Custom column and other helper columns) per your needs.
TODO: As a last step, rename the final columns per your requirements; see the sketch below.
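For example, to match the headers in your expected result ("Station" instead of "Airport/Code"), a final rename could look like this minimal sketch; the one-row stand-in table and the step name are hypothetical.
let
    // Hypothetical stand-in for one row of the pivoted result
    Pivoted = Table.FromRecords({[#"Airport/Code" = "YSJ", #"Agreed Differential" = "0.0797", #"FUEL PRICE - LT/CA$" = "0.8521"]}),
    // Rename the pivoted key column to match the expected output header
    #"Renamed Final" = Table.RenameColumns(Pivoted, {{"Airport/Code", "Station"}})
in
    #"Renamed Final"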
Hope this helps!
Roughly,
Your data consists of the file name plus four fields that form one set, with the actual data values spread across three columns (Column2, Column4, Column5), as you know. Each set of four fields appears as four consecutive rows. First, I combined all three value columns into one, resulting in a table with the columns Source.Name, Type, and DataValue, as in the sketch below.
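Here is a minimal sketch of that combine step on a one-row stand-in table; the sample values are illustrative and the step names mirror the solution above.
let
    // One-row stand-in for the imported sheet: a single set spread over three value columns
    Sample = Table.FromRecords({[Source.Name = "Nov 04,2024- Vendor.xlsx", Type = "Airport/Code", Column2 = "YSJ", Column4 = "YFC", Column5 = null]}),
    // Keep one value column at a time, rename it to DataValue, then stack the three copies
    Set1 = Table.RenameColumns(Table.RemoveColumns(Sample, {"Column4", "Column5"}), {{"Column2", "DataValue"}}),
    Set2 = Table.RenameColumns(Table.RemoveColumns(Sample, {"Column2", "Column5"}), {{"Column4", "DataValue"}}),
    Set3 = Table.RenameColumns(Table.RemoveColumns(Sample, {"Column2", "Column4"}), {{"Column5", "DataValue"}}),
    Combined = Table.Combine({Set1, Set2, Set3})
in
    Combined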
Now, regarding the code I provided: I used Table.AddIndexColumn in Power Query and derived a group number ("Custom") from the index so that every four rows share the same value. You can ignore the steps involving #"Added Custom1" and #"Removed Columns", as they were part of testing: if you check those steps you will see only the values 1, 2, 3, 0, which confirms that the previous step is producing correct sets of four. A minimal sketch of the index-to-group logic follows.
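This sketch applies the same formulas to a hypothetical eight-row slice (two sets of four); the step names are illustrative.
let
    // Hypothetical index column: eight rows, i.e. two sets of four
    Indexed = Table.FromRecords(List.Transform({1..8}, (i) => [Index = i])),
    // Same formula as the solution: rows 1-4 get group 1, rows 5-8 get group 2
    Grouped = Table.AddColumn(Indexed, "Custom", each Number.RoundDown(([Index] - 1) / 4) + 1),
    // The check column cycles 1, 2, 3, 0 within each set of four
    Checked = Table.AddColumn(Grouped, "Custom.1", each Number.Mod([Index], 4))
in
    Checked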
Pivoting the data values by Type is the actual trick: it converts each set of four rows into four columns across the entire dataset.
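For instance, on a single set (one Custom group), Table.Pivot turns the four Type/DataValue rows into one row with four columns; the sample values below are illustrative.
let
    // One set of four Type/DataValue rows sharing the same group key
    Sample = Table.FromRecords({
        [Custom = 1, Type = "Airport/Code", DataValue = "YSJ"],
        [Custom = 1, Type = "Aircraft Type", DataValue = "Commercial"],
        [Custom = 1, Type = "Agreed Differential", DataValue = "0.0797"],
        [Custom = 1, Type = "FUEL PRICE - LT/CA$", DataValue = "0.8521"]
    }),
    // Pivot by Type: the remaining column (Custom) keeps the set together as one output row
    Pivoted = Table.Pivot(Sample, List.Distinct(Sample[Type]), "Type", "DataValue")
in
    Pivoted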
Lastly, I need to remove the empty sets. I do this by taking the first field, "Airport/Code", replacing its blank values with nulls, and then removing those rows, as in the sketch below. If needed, you can optimize and drop the unwanted steps; I took a step-by-step approach so you can understand the solution first and consider optimizing it afterwards. 😊
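A minimal sketch of that cleanup, assuming the blank cells come through as a single space (as in the solution above); the stand-in rows are hypothetical.
let
    // Stand-in pivoted rows: one real station and one empty set marked by a blank Airport/Code
    Sample = Table.FromRecords({
        [#"Airport/Code" = "YSJ", #"Agreed Differential" = "0.0797"],
        [#"Airport/Code" = " ", #"Agreed Differential" = null]
    }),
    // Turn the blank placeholder into null, then drop rows with no airport code
    Replaced = Table.ReplaceValue(Sample, " ", null, Replacer.ReplaceValue, {"Airport/Code"}),
    Filtered = Table.SelectRows(Replaced, each [#"Airport/Code"] <> null and [#"Airport/Code"] <> "")
in
    Filtered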
Hope this explanation is helpful!
@sevenhills The technique works. Thanks so much!
The steps were a little complicated for me to understand, but I used them and got the results. Thanks so much!
#"Added Index" = Table.AddIndexColumn(#"MergeAllData", "Index", 1, 1, Int64.Type), #"Added Custom" = Table.AddColumn(#"Added Index", "Custom", each Number.RoundDown(([Index]-1)/4)+1), #"Added Custom1" = Table.AddColumn(#"Added Custom", "Custom.1", each Number.Mod([Index],4)), #"Removed Columns" = Table.RemoveColumns(#"Added Custom1",{"Custom.1"}),