DALI007
Helper I

Dataflow Gen2: Limit of 50 queries in a dataflow and bulk retrieval of tables

Hello,

I would need your help for two questions, please.

Question1:
My source is an Oracle database, and I have a problem with the number of tables I can retrieve in a Gen2 dataflow: a single dataflow can hold only 50 tables.
Is there a workaround to get more tables? (Please note that I have 500 tables to retrieve.)

Question2:
Given that the number of tables is significant (500 tables), is there a way to retrieve several tables in one go instead of selecting them one by one, please?

Thank you in advance for your help.
Regards


3 REPLIES
miguel
Community Admin

Only queries that have staging enabled or a data destination configured count toward the 50-queries-per-dataflow limit. Any other queries in your dataflow do not count toward that limit; quick examples are function queries and parameters.

I'm not entirely sure we would recommend using a single dataflow to load 500 tables. Instead, I'd recommend splitting your logic into multiple dataflows, taking into consideration the limit that Dataflows have today.

Another option could be to load your data to a Lakehouse using mirroring (once we have support for that data source) or data pipelines.
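
To illustrate the counting rule, here is a minimal Power Query M sketch (the server string, schema, and table name are hypothetical): a shared connection query with staging off and no destination does not count toward the 50, while each query that writes to a destination does.

    // Query "OracleSource" - staging off, no data destination: does NOT count toward the 50-query limit
    let
        Source = Oracle.Database("myhost:1521/ORCL")
    in
        Source

    // Query "DIM_CUSTOMER" - references OracleSource and has a Lakehouse destination: DOES count
    let
        Nav = OracleSource,
        // Pick one table out of the navigation table by schema and item name
        Customer = Nav{[Schema = "SALES", Item = "DIM_CUSTOMER"]}[Data]
    in
        Customer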

DALI007
Helper I

Hello @miguel,

Thank you for your response.

In fact, my goal is to retrieve the 500 tables that exist in my Oracle database via a dataflow, perform transformations on the tables, and finally store them in a Lakehouse.

I did not understand this point; could you clarify it, please: "there is no limit to the number of queries without staging and without a data destination that you can have". Do you mean that when staging is off in Dataflow Gen2, we can retrieve as many tables as we want?

And regarding bulk retrieval of tables in a dataflow, do you have a solution, please (Question 2)?

Thank you

miguel
Community Admin

You can only have 50 queries with staging or a data destination configured, but there's no limit to how many queries without staging and without a data destination you can have.

Are you trying to create a solution with hundreds of tables that load data to a destination, or what would your solution encompass?
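
For the bulk-retrieval question, one pattern is to filter the Oracle navigation table by schema instead of selecting tables one by one; a minimal M sketch follows (server and schema names are hypothetical). A query like this has no staging and no destination, so it does not count toward the limit, and the per-table queries you actually load can then reference it.

    let
        // Hypothetical Oracle server; the navigation table exposes Name, Data, Schema, Item and Kind columns
        Source = Oracle.Database("myhost:1521/ORCL"),
        // Keep every object of kind "Table" in the SALES schema in one shot
        SalesTables = Table.SelectRows(Source, each [Schema] = "SALES" and [Kind] = "Table")
    in
        SalesTables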
