I recently created a dataflow and connected it to the lakehouse. As I create more reports, I keep having to add more tables to the dataflow, and I've noticed it now takes a while to refresh. A few best-practice questions: What is the maximum number of tables to put in a dataflow? How many dataflows can I have under the Premium license? And should I just create additional dataflows, point their destinations at the same lakehouse, and then do the data modeling in the lakehouse to connect the data? I'm new, so your advice will be helpful.
I think there is a maximum of 50 tables per dataflow. If you have that many and are worried about performance, I suggest you break the transformations out into a second step, such as a stored procedure that moves data from the lakehouse to a warehouse. This frees up the dataflow to be a straightforward 'move data as fast as you can' operation.
Look into the 'medallion architecture' philosophy.
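A minimal sketch of what that second step might look like, assuming a Fabric warehouse that can read the lakehouse through its SQL endpoint. All object names here (`usp_LoadSilverSales`, `BronzeSales`, `SilverSales`) are invented for illustration, not from the original post:

```sql
-- Hypothetical example: the dataflow lands raw data in the lakehouse
-- (bronze layer), and this warehouse stored procedure does the
-- transformations afterwards, keeping the dataflow itself a plain copy.
CREATE PROCEDURE dbo.usp_LoadSilverSales
AS
BEGIN
    -- Rebuild the cleaned (silver) table from the raw (bronze) landing table
    DELETE FROM dbo.SilverSales;

    INSERT INTO dbo.SilverSales (OrderId, OrderDate, CustomerId, Amount)
    SELECT
        OrderId,
        CAST(OrderDate AS date),      -- normalize types here, not in the dataflow
        CustomerId,
        Amount
    FROM MyLakehouse.dbo.BronzeSales  -- cross-database query into the lakehouse
    WHERE Amount IS NOT NULL;         -- basic cleansing moved out of Power Query
END;
```

You would then schedule this procedure (for example, from a pipeline) to run after the dataflow refresh completes, which is one common way to lay out bronze and silver layers in a medallion architecture.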
Proud to be a Super User!
Hello @Anonymous ,
A single dataflow has a limit of 50 tables.
Reference: Power Query Online Limits
If you have a large number of tables, I recommend implementing staging tables for optimal performance.
Note: Fully query-folded operations significantly improve performance.
More details on query folding:
Boosting Power BI Performance with Query Folding Techniques
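To illustrate what query folding buys you: when your Power Query steps fold, the whole transformation is pushed down to the source as a single native query, instead of streaming every row into the mashup engine and transforming it there. For example, a 'remove other columns' step followed by a row-filter step against a SQL source might fold into one statement roughly like this (table and column names are invented for illustration):

```sql
-- Roughly what a folded 'select columns' + 'filter rows' step pair
-- gets translated to and executed at the source:
SELECT OrderId, OrderDate, Amount
FROM dbo.Sales
WHERE OrderDate >= '2024-01-01';
-- Without folding, all rows and columns of dbo.Sales would be pulled
-- across the wire and filtered client-side, which is far slower.
```

A practical habit is to keep foldable steps (filters, column selection, joins against the same source) early in the query and push non-foldable steps as late as possible.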
The Fast Copy feature in Gen2 Dataflows enhances performance. However, please note that connections going through a data gateway do not support Fast Copy.
More details on Fast Copy:
Accelerating Data Loads with Fast Copy in Fabric Dataflows
Kind Regards,
Gökberk Uzuntaş
📌 If this post helps, then please consider Accepting it as a solution and giving Kudos — it helps other members find answers faster!