
Reply
andreyminakov
Frequent Visitor

What is generally faster - Table.ReplaceValue or Table.TransformColumns, for multiple replacements?

Hi All!
I've got a question: if I need to do several replacements in a table column, what is the fastest way to do it, by design (it does NOT matter that one approach allows more types of transformations)?
1 ACCEPTED SOLUTION
Anonymous
Not applicable

@andreyminakov,

You can analyze the power query performance by using trace log. See https://www.excelando.co.il/en/analyzing-power-query-performance-source-large-files/.
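In addition to the trace log, you can emit your own trace messages from inside a query with Diagnostics.Trace, which wraps a value and logs when it is evaluated. A minimal sketch (the sample table, message text, and step names are illustrative, not from the original post):

let
    Source = #table({"A"}, {{"a.b.c"}}),
    Cleaned = Diagnostics.Trace(
        TraceLevel.Information,
        "Evaluating replacement step",
        () => Table.ReplaceValue(Source, ".", "", Replacer.ReplaceText, Table.ColumnNames(Source)),
        true
    )
in
    Cleaned

With tracing enabled, timestamps on such messages in the trace files give a rough before/after comparison of alternative steps.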

Regards,
Lydia


3 REPLIES
Anonymous
Not applicable

@andreyminakov,

It depends on your data. How would you replace the values in the column? To optimize multiple replacements, you can combine the Table.ReplaceValue and Table.TransformColumns functions.

There is a similar thread for your reference:
https://community.powerbi.com/t5/Desktop/Optimizing-multiple-replacements/td-p/102389
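One way to read "combine" is to let Table.TransformColumns apply a single function that performs all the replacements in one pass over each value. A sketch, assuming a hypothetical list of old/new pairs (the sample table and the Replacements list are illustrative):

let
    Source = #table({"A", "B"}, {{"1.2,3", "x;y"}}),
    Replacements = {{".", ""}, {",", ""}, {";", ""}},
    OnePass = (t as text) => List.Accumulate(Replacements, t, (s, r) => Text.Replace(s, r{0}, r{1})),
    Result = Table.TransformColumns(
        Source,
        List.Transform(Table.ColumnNames(Source), (c) => {c, OnePass, type text})
    )
in
    Result

This touches every row once, regardless of how many replacement pairs there are.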

Regards,
Lydia

Hi Lydia! Thanks a lot for your reply and the useful thread about replacing several characters in a table. But I would still like to understand which way is better from an execution-time point of view, and I haven't found information about that there.
In general, I can remove (as a simpler variant of replacement) a list of characters in all the columns of a table in these ways:

1. Multiple calls of Table.ReplaceValue (it's possible to make these calls through List.Generate, but I guess that doesn't influence the execution time).
Table.ReplaceValue(Source, ".", "", Replacer.ReplaceText, Table.ColumnNames(Source))
Table.ReplaceValue(Source, ",", "", Replacer.ReplaceText, Table.ColumnNames(Source))
...
2. Call of Table.TransformColumns
Table.TransformColumns(
    Source,
    List.Zip({
        Table.ColumnNames(Source),
        List.Repeat(
            {each Text.Remove(_, Text.ToList(".:;?!<>@#$%^&*=+"))},
            Table.ColumnCount(Source)
        )
    })
)
3. Call Custom Function in Table.ReplaceValue
Table.ReplaceValue(
    Source,
    ".:;?!<>@#$%^&*=+",
    "",
    (x, y, z) => List.Accumulate(
        Text.ToList(y),
        x,
        (s, c) => Text.Remove(s, c)
    ),
    Table.ColumnNames(Source)
)
And the question is: which is quicker by design of Power Query?
What creates more overhead:
1. doing all the transformations at once for each row, but spending time PER ROW on preparing the data needed for the removal (building a list from the text ".:;?!<>@#$%^&*=+" in the example above),
2. or preparing the data once (splitting the text ".:;?!<>@#$%^&*=+" into characters beforehand), and then making several passes through all the rows, one per character?
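A third option worth considering is to hoist the per-row preparation into its own step, so the character list is built once and the table is still traversed only once. A sketch (the sample table is illustrative; note that M's lazy evaluation means measured timings, not reasoning about the code shape, should decide which variant wins):

let
    Source = #table({"A"}, {{"a.b;c"}}),
    // Built once, outside the per-row function
    CharsToRemove = Text.ToList(".:;?!<>@#$%^&*=+"),
    RemoveChars = (t as text) => Text.Remove(t, CharsToRemove),
    Result = Table.TransformColumns(
        Source,
        List.Transform(Table.ColumnNames(Source), (c) => {c, RemoveChars})
    )
in
    Result

Text.Remove accepts a list of characters, so this avoids both the per-row Text.ToList call and the multiple passes of repeated Table.ReplaceValue calls.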
