
The ultimate Microsoft Fabric, Power BI, Azure AI & SQL learning event! Join us in Las Vegas from March 26-28, 2024. Use code MSCUST for a $100 discount. Register Now

Frequent Visitor

How to make Power Query faster?

Does anyone know of a way to work with Power Query in a much faster way?

I am merging 3 different tables, which have infinite rows because they come from combinations of several other tables. But any transformation or merge takes a long time to load.

Does anyone know a way to make it faster, so I don't lose so much time?

Super User

Performance depends on a lot of specific details. If there were optimizations that were universally true, then they would be built in. If your tables have infinite rows, then loading or transforming will take an infinite amount of time regardless of the rate involved.


Assuming you're asking about real-world (finite) load, to help determine if optimizations are possible, we'd need to know what data sources (e.g. CSV, folder, SQL server, SharePoint, etc.) you're working with and the approximate number of rows from each (just the order of magnitude is fine but infinite isn't a real answer). If they're all from a single SQL database, then you're much more likely to be able to make things faster than if you're trying to merge huge tables from separate sources.

Dude, it has about 500,000 rows. I pulled the data from a folder containing several CSV files and merged them into a single table. This final table has roughly 500,000 rows.

I had the same issue in the past. It turned out my problem wasn't the rows, as Power BI is more than capable of handling 500k rows. My issue was appending the CSVs. The more CSV files I threw at it, the slower it got, regardless of the total number of rows. That's just one downside to using a SharePoint folder and appending several files with Power Query. Power Query is not that good at handling it, in my personal experience.


I've since moved on and started using Power Automate to append new CSV data onto a base CSV file and using that CSV as the source.

@Imrans123 Raw text files like CSV just aren't a very efficient source to load data from but it's definitely faster to load from one big one than from lots of smaller ones. Appending incrementally with Power Automate is a nice way to do the appending once rather than every time you refresh the query. You could get even better query performance if you used Power Automate to load the data to something more like a database with a query engine rather than a text file.
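To illustrate, here's a minimal M sketch of loading one consolidated CSV and typing the columns once. The file path and column names are hypothetical; substitute your own:

```m
let
    // Hypothetical path to the consolidated CSV produced by Power Automate
    Source = Csv.Document(
        File.Contents("C:\Data\combined.csv"),
        [Delimiter = ",", Encoding = 65001, QuoteStyle = QuoteStyle.Csv]
    ),
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    // Typing columns once on the single file avoids repeating that work per source file
    Typed = Table.TransformColumnTypes(Promoted, {{"OrderID", Int64.Type}, {"Amount", type number}})
in
    Typed
```

Loading one file this way means the header promotion and type conversion run once, instead of once per file as they would in a combined-folder query.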

So, with that approach, that particular report would get no updates?

OK. Half a million rows is likely small enough to buffer the component tables so you can join them in memory. Loading CSVs from a folder will be slow no matter what but if you buffer them after loading, you should be able to do merges and transformations pretty quickly on tables that small.
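As a sketch of the buffering idea (the query names `SalesFolder` and `Customers` and the `CustomerID` key are placeholders for your own queries and columns):

```m
let
    // Table.Buffer pulls each table fully into memory once,
    // so the merge below doesn't re-scan the slow CSV source
    BufferedSales = Table.Buffer(SalesFolder),
    BufferedCustomers = Table.Buffer(Customers),
    // With both sides buffered, the join runs against the in-memory copies
    Merged = Table.NestedJoin(
        BufferedSales, {"CustomerID"},
        BufferedCustomers, {"CustomerID"},
        "CustomerDetail", JoinKind.LeftOuter
    )
in
    Merged
```

Note that buffering only helps when the tables fit comfortably in memory, as half a million rows typically do; against a database source it can actually hurt, because it prevents query folding.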


I'd recommend checking out @parry2k's post here for more detail on buffering:
