Matheuspeppers
Frequent Visitor

How to make Power Query faster?

Does anyone know of a way to work with Power Query in a much faster way?

I am merging 3 different tables, which have infinite rows, because they come from combinations of several other tables. Every transformation or merge takes a long time to load.

Does anyone know a way to make it faster, so I don't waste so much time?

6 REPLIES
AlexisOlson
Super User

Performance depends on a lot of specific details. If there were optimizations that were universally true, then they would be built in. If your tables have infinite rows, then loading or transforming will take an infinite amount of time regardless of the rate involved.

 

Assuming you're asking about real-world (finite) load, to help determine if optimizations are possible, we'd need to know what data sources (e.g. CSV, folder, SQL server, SharePoint, etc.) you're working with and the approximate number of rows from each (just the order of magnitude is fine but infinite isn't a real answer). If they're all from a single SQL database, then you're much more likely to be able to make things faster than if you're trying to merge huge tables from separate sources.

Dude, it has about 500,000 rows. I pulled the data from a folder containing several CSV files and combined them into a single table. That final table has roughly 500,000 rows.

I had the same issue in the past. It turned out my issue wasn't with the rows, as Power BI is more than capable of handling 500k rows. My issue was appending the CSVs. The more CSV files I threw at it, the slower it got, regardless of how many rows there were in total. That's just one downside to using a SharePoint folder and appending several files with Power Query; Power Query is not that good at handling it, in my personal experience.

 

I've since moved on and started using Power Automate to append new CSV data onto a base CSV file, and I use that CSV as the source.

@Imrans123 Raw text files like CSV just aren't a very efficient source to load data from, but it's definitely faster to load from one big one than from lots of smaller ones. Appending incrementally with Power Automate is a nice way to do the appending once rather than on every query refresh. You could get even better query performance if you used Power Automate to load the data into something more like a database with a query engine rather than a text file.
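Just to illustrate that, here's a minimal sketch of what the query could look like once it points at a single pre-appended file instead of a folder of small CSVs. The file path, encoding, and the "Amount" column below are placeholders, not anything from your actual setup:

```
let
    // Read the one consolidated CSV that the Power Automate flow maintains,
    // instead of combining dozens of small files on every refresh.
    Source = Csv.Document(
        File.Contents("C:\Data\combined.csv"),   // placeholder path
        [Delimiter = ",", Encoding = 65001, QuoteStyle = QuoteStyle.Csv]
    ),
    PromotedHeaders = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    // Set column types explicitly; "Amount" is just an example column name.
    Typed = Table.TransformColumnTypes(PromotedHeaders, {{"Amount", type number}})
in
    Typed
```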

So, with that approach, that particular report won't get updates automatically?

OK. Half a million rows is likely small enough to buffer the component tables so you can join them in memory. Loading CSVs from a folder will be slow no matter what, but if you buffer them after loading, you should be able to do merges and transformations pretty quickly on tables that small.

 

I'd recommend checking out @parry2k's post here for more detail on buffering:

https://community.powerbi.com/t5/Desktop/Learn-how-Table-Buffer-Power-Query-M-function-improved-the/...
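For reference, a buffered merge could look roughly like this. The query names (Sales, Customers) and the key/column names (CustomerID, CustomerName) are made-up placeholders; the point is just to wrap the sources in Table.Buffer before the join:

```
let
    // Pull each source table into memory once so the merge doesn't
    // re-evaluate the underlying folder/CSV queries for every lookup.
    BufferedSales = Table.Buffer(Sales),
    BufferedCustomers = Table.Buffer(Customers),

    // Merge against the buffered copies rather than the raw queries.
    Merged = Table.NestedJoin(
        BufferedSales, {"CustomerID"},
        BufferedCustomers, {"CustomerID"},
        "Customer",
        JoinKind.LeftOuter
    ),
    Expanded = Table.ExpandTableColumn(Merged, "Customer", {"CustomerName"})
in
    Expanded
```

One caveat: Table.Buffer holds the whole table in memory and prevents query folding, so it's most useful for sources like CSV folders where there's no folding to lose anyway.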
