DALI007
Helper II

Dataflow Gen2: Limit of 50 queries in a dataflow and retrieving tables in bulk

Hello,

I need your help with two questions, please.

Question 1:
My source is an Oracle database. I have a problem with the number of tables I can retrieve in a Gen2 dataflow: a single dataflow can hold only 50 tables.
Is there a workaround to get more tables? (Please note that I have 500 tables to retrieve.)

Question 2:
Given that the number of tables is significant (500 tables), is it possible to retrieve several tables in one go instead of selecting them one by one, please?

Thank you in advance for your help.
Regards

1 ACCEPTED SOLUTION
miguel
Community Admin

Only queries that have staging enabled or a data destination configured count toward the 50-queries-per-dataflow limit. Any other queries in your Dataflow do not count toward that limit. Some quick examples of those are function queries and parameters.

 

I'm not entirely sure we would recommend trying to use a single Dataflow to load 500 tables. Instead, I'd recommend splitting your logic into multiple Dataflows, taking into consideration the limit we have for Dataflows today.

Perhaps another option could be to load your data to a Lakehouse using mirroring (once we have support for that data source) or data pipelines.
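On the bulk question: if you go down the pipeline (or scripted) route, the list of tables does not have to be picked one by one; it can be read straight from Oracle's data dictionary. A minimal sketch, assuming the python-oracledb driver, with placeholder credentials and schema name (nothing here is part of Dataflow Gen2 itself):

# Sketch: enumerate every table in one Oracle schema so a pipeline or
# script can iterate over them in bulk instead of selecting 500 by hand.
# Assumes python-oracledb; user/password/dsn/schema are placeholders.
import oracledb

connection = oracledb.connect(
    user="MY_USER",
    password="MY_PASSWORD",
    dsn="dbhost.example.com/ORCLPDB1",
)

with connection.cursor() as cursor:
    # ALL_TABLES is Oracle's data dictionary view of tables you can access.
    cursor.execute(
        "SELECT table_name FROM all_tables "
        "WHERE owner = :owner ORDER BY table_name",
        owner="MY_SCHEMA",
    )
    tables = [row[0] for row in cursor]

print(f"Found {len(tables)} tables, e.g. {tables[:3]}")

From there, a pipeline ForEach activity (or a script) can drive one copy per table name rather than a manual selection.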


3 REPLIES

DALI007
Helper II

Hello @miguel,

Thank you for your response.

In fact, my goal is to retrieve the 500 tables that exist in my Oracle database via a dataflow, perform transformations on the tables, and finally store them in a lakehouse.

I did not understand this point; could you clarify it, please: "there is no limit to the number of requests without staging and without destination of data that you can have". Do you mean that when staging is off in Dataflow Gen2, we can retrieve as many tables as we want?

And regarding the bulk retrieval of tables in a dataflow, do you have a solution, please (Question 2)?

Thank you

miguel
Community Admin

You can only have 50 queries with "staging" or a data destination configured, but there's no limit to how many queries without staging and no data destination you can have.
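In practice, that means 500 destination-bound tables would need at least 10 dataflows. A quick sketch of how you might plan the split (the table names below are made up, not read from your database):

# Sketch: partition table names into batches that respect the
# 50-staged/destination-queries-per-dataflow limit.
def plan_dataflows(table_names, limit=50):
    """Return consecutive batches of at most `limit` tables each."""
    return [table_names[i:i + limit]
            for i in range(0, len(table_names), limit)]

tables = [f"TABLE_{n:03d}" for n in range(500)]  # stand-in for the real list
batches = plan_dataflows(tables)
print(len(batches))     # -> 10 dataflows
print(len(batches[0]))  # -> 50 tables in the first one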

Are you trying to create a solution with hundreds of tables that load data to a destination, or what would your solution encompass?
