I have a table with a column called "Entry Page" that contains the URL a user entered our website through. These URLs contain UTM codes.
Example: https://www.mywebsite.com/?utm_source=twitter&utm_medium=paid&utm_campaign=competition
I want to create three new columns in the table (e.g. "UTM Source", "UTM Medium" and "UTM Campaign") and extract the data from the Entry Page URL to populate the columns.
E.g.
Entry Page | UTM Source | UTM Medium | UTM Campaign |
https://www.mywebsite.com/?utm_source=twitter&utm_medium=paid&utm_campaign=competition | twitter | paid | competition |
How do I accomplish this with DAX?
I.e. I want to do this, but in Power BI: https://mode.com/blog/parsing-utm-parameters-in-sql
You can also do it through the Query Editor's interface.
Please see the steps in the attached file's Query Editor:
let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("pc0xCsMwDIXhu3guccjQoWBykMQU1xGtoJJNLCFy+zpTp05dP97jXxb3Eqnt5r2ZDXQYPBoKDLmQn1Xo3oruGYIYisC+6jhO19MJNlQKNeH2xZyow5NDv1cQFCzs4uXfiPKPDHKfc3q7GD8=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [Column1 = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source, {{"Column1", type text}}),
    // Keep only the query string (everything after "?")
    #"Inserted Text After Delimiter" = Table.AddColumn(#"Changed Type", "Text After Delimiter", each Text.AfterDelimiter([Column1], "?"), type text),
    // Split the query string into one column per key=value pair
    #"Split Column by Delimiter" = Table.SplitColumn(#"Inserted Text After Delimiter", "Text After Delimiter", Splitter.SplitTextByDelimiter("&", QuoteStyle.Csv), {"Text After Delimiter.1", "Text After Delimiter.2", "Text After Delimiter.3"}),
    // Unpivot so each key=value pair becomes its own row
    #"Unpivoted Columns" = Table.UnpivotOtherColumns(#"Split Column by Delimiter", {"Column1"}, "Attribute", "Value"),
    #"Removed Columns" = Table.RemoveColumns(#"Unpivoted Columns", {"Attribute"}),
    // Split each pair into key and value
    #"Split Column by Delimiter1" = Table.SplitColumn(#"Removed Columns", "Value", Splitter.SplitTextByDelimiter("=", QuoteStyle.Csv), {"Value.1", "Value.2"}),
    #"Changed Type1" = Table.TransformColumnTypes(#"Split Column by Delimiter1", {{"Value.1", type text}, {"Value.2", type text}}),
    // Pivot the keys back into columns (utm_source, utm_medium, utm_campaign)
    #"Pivoted Column" = Table.Pivot(#"Changed Type1", List.Distinct(#"Changed Type1"[Value.1]), "Value.1", "Value.2")
in
    #"Pivoted Column"
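Since the original question asked for DAX specifically, the same extraction can also be sketched as a calculated column. This is only a sketch: the table name 'Pages' is an assumption, and you would repeat the pattern with "utm_medium=" and "utm_campaign=" for the other two columns.

UTM Source =
VAR Url = 'Pages'[Entry Page]          // 'Pages' is a placeholder table name
VAR Key = "utm_source="
VAR Pos = SEARCH ( Key, Url, 1, 0 )    // 0 if the key is absent from the URL
VAR Rest = MID ( Url, Pos + LEN ( Key ), LEN ( Url ) )
// Position of the next "&"; default past the end if this is the last parameter
VAR Amp = SEARCH ( "&", Rest, 1, LEN ( Rest ) + 1 )
RETURN
    IF ( Pos = 0, BLANK (), LEFT ( Rest, Amp - 1 ) )

For the example URL this returns "twitter". The Query Editor approach above is generally preferable, though, since the parsing then happens once at refresh time rather than in a calculated column.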
Thank you so much for this! I finally got around to implementing this.
I had to do a few minor tweaks to make it work 100% with my data, but it was very easy with the help of your code.