Anonymous
Not applicable

Gen2 DataFlows

I recently created a dataflow and connected it to the lakehouse. As I create more reports, I keep adding tables to the dataflow, and I'm now noticing that it takes a while to refresh. A few best-practice questions: What is the maximum number of tables to put in a dataflow? How many dataflows can I have under a Premium license? And should I just create additional dataflows with the same lakehouse as the destination, then do the data modeling in the lakehouse to connect the data points? I'm new, so your advice will be helpful.

 

1 ACCEPTED SOLUTION
ToddChitt
Super User

I think there is a maximum of 50 tables per dataflow. If you have that many and are worried about performance, I suggest moving the transformations into a second step, such as a stored procedure that moves data from the lakehouse to a warehouse. That frees the dataflow to be a straightforward 'move data as fast as you can' operation.

Look into the 'medallion architecture' philosophy.
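To make the two-step idea above concrete, here is a minimal sketch using sqlite3 as a stand-in for the lakehouse ("bronze") and warehouse layers. The table and column names are hypothetical; the point is the pattern: the dataflow only lands raw data, and a separate set-based step (the stored-procedure analogue) does the transformations afterward.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Step 1: the "dataflow" just lands raw data as fast as possible (bronze layer).
cur.execute("CREATE TABLE bronze_sales (id INTEGER, region TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO bronze_sales VALUES (?, ?, ?)",
    [(1, "East", 100.0), (2, "West", 250.0), (3, "East", 75.0)],
)

# Step 2: a separate set-based step (the stored-procedure analogue)
# transforms the raw data downstream of the load.
cur.execute(
    """
    CREATE TABLE silver_sales_by_region AS
    SELECT region, SUM(amount) AS total_amount
    FROM bronze_sales
    GROUP BY region
    """
)

for row in cur.execute(
    "SELECT region, total_amount FROM silver_sales_by_region ORDER BY region"
):
    print(row)
```

The same separation is what the medallion architecture formalizes: fast, untransformed ingestion first, modeling and aggregation in later layers.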




Did I answer your question? If so, mark my post as a solution. Also consider helping someone else in the forums!

Proud to be a Super User!






2 REPLIES
uzuntasgokberk
Super User

Hello @Anonymous ,

A single dataflow has a limit of 50 tables.
Reference: Power Query Online Limits
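If you end up with more tables than one dataflow allows, one way to think about splitting them is simple batching. This is just an illustrative sketch (the table names and helper are hypothetical, not a Fabric API): group your table list into batches that each respect the 50-table limit, one batch per dataflow.

```python
MAX_TABLES_PER_DATAFLOW = 50  # Power Query Online limit referenced above

def plan_dataflows(tables, limit=MAX_TABLES_PER_DATAFLOW):
    """Group table names into batches of at most `limit` tables per dataflow."""
    return [tables[i:i + limit] for i in range(0, len(tables), limit)]

# e.g. 120 tables to land in the same lakehouse destination
tables = [f"table_{n:03d}" for n in range(120)]
batches = plan_dataflows(tables)
print(len(batches), [len(b) for b in batches])  # 3 dataflows: 50 + 50 + 20
```

Each batch would become its own dataflow writing to the same lakehouse, which matches the approach the original poster was already considering.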

Best Practices for Managing Multiple Tables in Dataflows

If you have a large number of tables, I recommend implementing staging tables for optimal performance.

  • If your data source is SQL and some steps support query folding, you can create an initial dataflow to take advantage of folding.
  • After that, you can link this dataflow to a new one for additional transformations that do not support query folding.

Note: Fully query-folded operations significantly improve performance.
More details on query folding:
Boosting Power BI Performance with Query Folding Techniques
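To illustrate why folding matters, here is a small sketch of the idea using sqlite3 as a stand-in source (the table is hypothetical). A folded step pushes the filter into the source query, so only matching rows cross the wire; an unfolded step pulls every row and filters client-side, which is the behavior you avoid by folding as many steps as possible.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, status TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(i, "open" if i % 4 == 0 else "closed") for i in range(1000)],
)

# Folded: the source does the work; only matching rows are transferred.
folded = conn.execute("SELECT id FROM orders WHERE status = 'open'").fetchall()

# Not folded: all 1000 rows are transferred, then filtered locally.
all_rows = conn.execute("SELECT id, status FROM orders").fetchall()
unfolded = [(i,) for (i, s) in all_rows if s == "open"]

assert folded == unfolded  # same result, very different data movement
print(len(folded))
```

The results are identical either way; the difference is how much data moves, which is exactly why a first dataflow that folds against SQL, linked to a second one for the non-folding steps, performs better.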

Fast Copy in Dataflow Gen2

The Fast Copy feature in Gen2 Dataflows enhances performance. However, please note that Cloud Gateway does not support Fast Copy.

More details on Fast Copy:
Accelerating Data Loads with Fast Copy in Fabric Dataflows

Kind Regards,
Gökberk Uzuntaş

📌 If this post helps, then please consider Accepting it as a solution and giving Kudos — it helps other members find answers faster!

 

