Hello,
I am looking for a method to go from a data table like so
Example Data:
abc_X_123 | abc_X_456 | abc_X_789 | abc_Y_123 | abc_Z_123 | abc_Z_456 |
True | False | null | True | True | null |
False | False | True | False | False | True |
null | True | null | null | True | null |
(Bold and italicized to make the data more readable)
to a data table like this one below.
End Goal:
X | Y | Z |
3 | 1 | 3 |
(True = 1 and False/Null = 0)
My data is too large to do hard-coded transformations, but the gist is that there is a name I want to keep, in this case X, Y, and Z, which always appears between two underscores. There are sometimes multiple columns for a single name, in which case those columns need to be grouped into one. The end goal just needs to be a table where each name/column has a total for the number of times True appeared.
I am using Power Query, and I load my data in from a folder. I use the Combine and Transform option to do so because I will add new data on a weekly basis.
Hi @Rick_S137 ,
Thank you for reaching out to the Microsoft Fabric Community Forum. Also, thank you @lbendlin, @AntrikshSharma, and @SundarRaj for your inputs.
From what you’ve described and shown in screenshots, the issue seems to stem from inconsistent data types and how "null" (text) is being treated versus actual null values, especially when working with untyped or folder-imported sources.
Here’s an alternative solution that avoids hardcoding and works well with dynamic data (like your folder import setup):
let
    Source = Excel.CurrentWorkbook(){[Name="Table1"]}[Content],
    // Convert every value to lowercase text, mapping real nulls to the text "null"
    AsText = Table.TransformColumns(Source, List.Transform(Table.ColumnNames(Source), each {_, each if _ = null then "null" else Text.Lower(Text.From(_)), type text})),
    // Turn "true" into 1 and everything else (including "false" and "null") into 0
    AsBinary = Table.TransformColumns(AsText, List.Transform(Table.ColumnNames(AsText), each {_, each if _ = "true" then 1 else 0, type number})),
    Unpivoted = Table.UnpivotOtherColumns(AsBinary, {}, "Attribute", "Value"),
    // Extract the group name between the two underscores (e.g. X, Y, Z)
    Extracted = Table.TransformColumns(Unpivoted, {{"Attribute", each Text.Middle(_, Text.PositionOf(_, "_") + 1, Text.PositionOf(Text.Range(_, Text.PositionOf(_, "_") + 1), "_")), type text}}),
    Grouped = Table.Group(Extracted, {"Attribute"}, {{"Grouped", each Table.ToList(Table.TransformColumns(_, {"Value", each _})), type list}}),
    Final = Table.FromRows(List.Zip(Grouped[Grouped]), Grouped[Attribute])
in
    Final
This approach handles "null" values (as a string) gracefully and does not require changing data types, which helps avoid issues with folder imports. It maintains your row structure, meaning it doesn't summarize or aggregate the data. Additionally, it works well even if new columns are added later from files in the folder.
I hope this resolves your issue. If you need any further assistance, feel free to reach out.
If this post helps, then please give us Kudos and consider accepting it as a solution to help other members find it more quickly.
Thank you.
Hello,
Thank you for the help. I am starting to think that there is some other issue with the data, because even after trying everyone's code, I still get an error that doesn't seem to make sense. Your code worked up until the last step, "Grouped", at which point, when I look at the lists it generates, I get this error.
Hi @Rick_S137 ,
Thank you for your continued engagement and for highlighting the issue you're encountering.
You're right, the error you’re seeing at the Grouped step stems from how that transformation aggregates values into lists, which unintentionally disrupts the per-row structure you're aiming to maintain.
Based on your requirement to count the number of "true" values (as 1s) grouped by dynamic categories like X, Y, and Z on a per-row basis, I'd recommend slightly modifying the final part of the transformation to pivot by group instead of grouping into lists.
Here is the revised logic:
<Insert the revised M code from above here>
This should preserve the row structure and give you the desired summarized output without introducing grouping errors.
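Purely as an illustration of that idea, here is a minimal self-contained sketch of pivoting by group. It uses made-up inline sample data shaped like the Extracted step of the earlier query (an Attribute column holding the group name and a Value column holding 1 or 0), so it is a sketch of the approach rather than the revised code itself:
let
    // Made-up stand-in for the Extracted step of the earlier query
    Extracted = Table.FromRecords({
        [Attribute = "X", Value = 1],
        [Attribute = "Y", Value = 0],
        [Attribute = "X", Value = 1],
        [Attribute = "Z", Value = 1]
    }),
    // Pivot the group names into columns, summing the 1s per group.
    // To keep one output row per original data row instead of a single
    // total row, add an index column before unpivoting and keep it through this pivot.
    Pivoted = Table.Pivot(
        Extracted,
        List.Distinct(Extracted[Attribute]),
        "Attribute",
        "Value",
        List.Sum
    )
in
    Pivoted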
Let me know if this helps resolve the issue; I'll be happy to assist further if needed!
If this response resolves your query, please consider marking it as the solution for the benefit of others.
Thank you.
Hi,
I don't know how to write that code. Would you be able to create it?
Hi @Rick_S137 ,
If possible, could you please provide more details about your data?
How to provide sample data in the Power BI Forum
You can refer to the following link to upload the file to the community.
How to upload PBI in Community
Thank you.
Hi @Rick_S137 ,
Thank you for the clarification regarding the issue you were facing.
Since no specific file was provided, I've tested the logic using sample data based on your original example. The approach below successfully transforms the dataset to meet your stated goal of aggregating the number of true values by group (e.g., X, Y, Z) on a per-row basis, regardless of how many columns are in each group.
Please refer to the screenshot below for your understanding:
I am also including a .pbix file for your reference; please have a look.
I hope this resolves your issue. If you need any further assistance, feel free to reach out.
If this post helps, then please give us Kudos and consider accepting it as a solution to help other members find it more quickly.
Thank you.
Hi @Rick_S137 ,
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.
Thank you.
Hi @Rick_S137 ,
May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems solve them faster.
Thank you.
@Rick_S137 Try this:
let
    // "Table" stands for the input table; Byte.From turns true/false into 1/0 (nulls stay null)
    Source = Table.TransformColumns ( Table, {}, Byte.From ),
    Unpivot = Table.UnpivotOtherColumns ( Source, {}, "A", "V" ),
    // Keep only the text between the two underscores as the group name
    ColNames = Table.TransformColumns (
        Unpivot,
        { "A", each Text.BetweenDelimiters ( _, "_", "_" ), type text }
    ),
    // Pivot the group names into columns, summing the 1s
    Pivot = Table.Pivot ( ColNames, List.Distinct ( ColNames[A] ), "A", "V", List.Sum )
in
    Pivot
Alternate approach:
let
    // Transpose so the original column names become a column of values
    Source = Table.Transpose ( Table.DemoteHeaders ( Table ) ),
    Unpivot = Table.UnpivotOtherColumns ( Source, { "Column1" }, "A", "V" ),
    // Extract the group name and convert true/false to 1/0
    ColNames = Table.TransformColumns (
        Unpivot,
        { { "Column1", each Text.BetweenDelimiters ( _, "_", "_" ), type text }, { "V", Byte.From } }
    ),
    // Sum per group, then pivot the groups back out into columns
    Group = Table.Group ( ColNames, "Column1", { "Sum", each List.Sum ( [V] ) } ),
    Pivot = Table.Pivot ( Group, List.Distinct ( Group[Column1] ), "Column1", "Sum", List.Sum )
in
    Pivot
Hi,
Your code works well up to the last step, where I end up getting this error. I think this has to do with how the nulls are generated from my original source data: the nulls appear in the table as "null", but when I look at the filter, the nulls show up as "(null)".
Then those are the literal word "null" and not actual nulls, so you have to treat them first by replacing them or with some other operation.
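For example, a minimal sketch of replacing the literal word "null" with a real null in every column before any other steps; the inline table and column names here are only placeholders based on the sample data:
let
    // Placeholder sample; in practice this would be the combined-folder query
    Source = Table.FromRows(
        {{"True", "null"}, {"False", "True"}},
        {"abc_X_123", "abc_Y_123"}
    ),
    // Replace the literal word "null" with an actual null in every column
    ReplacedNullText = Table.ReplaceValue(
        Source,
        "null",
        null,
        Replacer.ReplaceValue,
        Table.ColumnNames(Source)
    )
in
    ReplacedNullText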
I tried using Replace Values and switched all the nulls to 0s, but I am still getting the same error.
Hi @Rick_S137 , another approach you can look at. Thanks!
Here's the code:
let
    Source = Excel.CurrentWorkbook(){[Name="Table1"]}[Content],
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"abc_X_123", type logical}, {"abc_X_456", type logical}, {"abc_X_789", type logical}, {"abc_Y_123", type logical}, {"abc_Z_123", type logical}, {"abc_Z_456", type logical}}),
    // Convert true/false to 1/0
    Numbers = Table.TransformColumns(#"Changed Type",{},each Number.From(_)),
    // Transpose so each original column becomes a row starting with its name
    Demote = Table.FromColumns(Table.ToRows(Table.DemoteHeaders(Numbers))),
    Split = Table.SplitColumn(Demote, "Column1", Splitter.SplitTextByEachDelimiter({"_"}, QuoteStyle.Csv, true), {"Column1.1", "Column1.2"}),
    Remove = Table.RemoveColumns(Split,{"Column1.2"}),
    Group = Table.Group(Remove, {"Column1.1"}, {{"All", each _, type table [Column1.1=nullable text, Column2=nullable number, Column3=number, Column4=nullable number]}}),
    // Keep only the numeric values in each grouped table
    List = Table.TransformColumns(Group, {"All", each List.Select(List.Combine(Table.ToColumns(_)), each not (try Number.From(_))[HasError])}),
    All = Table.TransformColumns(List,{"All", each List.Sum(List.RemoveNulls(_))}),
    Cols = List.Transform(All[Column1.1], each Text.AfterDelimiter(_,"_")),
    Table = Table.FromRows(Table.ToColumns(All[[All]]),Cols)
in
    Table
After inputting this code, I get this issue:
Remove the #"Changed Type" step as well.
Your end goal is not a desirable format. Leave the pivoting to the visuals.
let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WCikqTVXSUXJLzCkG0XmlOTlACioKpcCCsTrRcFUwGk0zijBIPappUB4WwdhYAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [abc_X_123 = _t, abc_X_456 = _t, abc_X_789 = _t, abc_Y_123 = _t, abc_Z_123 = _t, abc_Z_456 = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"abc_X_123", type logical}, {"abc_X_456", type logical}, {"abc_X_789", type logical}, {"abc_Y_123", type logical}, {"abc_Z_123", type logical}, {"abc_Z_456", type logical}}),
    #"Unpivoted Columns" = Table.UnpivotOtherColumns(#"Changed Type", {}, "Attribute", "Value"),
    #"Split Column by Delimiter" = Table.SplitColumn(#"Unpivoted Columns", "Attribute", Splitter.SplitTextByDelimiter("_", QuoteStyle.Csv), {"Attribute.1", "Attribute.2", "Attribute.3"}),
    #"Filtered Rows" = Table.SelectRows(#"Split Column by Delimiter", each ([Value] = true)),
    #"Removed Other Columns" = Table.SelectColumns(#"Filtered Rows",{"Attribute.2", "Value"}),
    #"Grouped Rows" = Table.Group(#"Removed Other Columns", {"Attribute.2"}, {{"Count", each Table.RowCount(_), Int64.Type}})
in
    #"Grouped Rows"
How to use this code: Create a new Blank Query. Click on "Advanced Editor". Replace the code in the window with the code provided here. Click "Done". Once you have examined the code, replace the entire Source step with your own source.
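As a hypothetical sketch only, the same unpivot-filter-group idea can be adapted for a dynamic, folder-based source without hard-coding column names or a Changed Type step; the inline sample table below is just a stand-in for your combined-folder query:
let
    // Stand-in sample data; replace this Source step with your combined-folder query
    Source = Table.FromRows(
        {{"True", "null", "True"}, {"False", "True", "null"}},
        {"abc_X_123", "abc_X_456", "abc_Y_123"}
    ),
    // Unpivot every column so newly added weekly columns are handled automatically
    Unpivoted = Table.UnpivotOtherColumns(Source, {}, "Attribute", "Value"),
    // Keep only values that read as true, whether logical or text;
    // real nulls are dropped by the unpivot and the literal text "null" fails this test
    TrueRows = Table.SelectRows(Unpivoted, each Text.Lower(Text.From([Value])) = "true"),
    // Extract the group name between the two underscores (X, Y, Z, ...)
    WithGroup = Table.TransformColumns(TrueRows,
        {"Attribute", each Text.BetweenDelimiters(_, "_", "_"), type text}),
    // Count the number of true values per group
    Counts = Table.Group(WithGroup, {"Attribute"},
        {{"Count", each Table.RowCount(_), Int64.Type}})
in
    Counts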
I've tried changing my source data so all the nulls are replaced with false, because I am interested in true values, but I still get the same error about nulls. I have also tried making all the data 1s and 0s before running your code, and again I run into the error with the nulls. Is there something else that could be causing this?
Please provide sample data that fully covers your issue.
Please show the expected outcome based on the sample data you provided.