
New Member

What is the best way to work with a large amount of data with Gen1 Dataflows?

In my company we work with Dataflows Gen1 and a large volume of data, which in some cases causes problems with data loading: there is a lot of data, and we load it from Denodo, which has a maximum loading time (timeout) of 15 minutes, and in some cases that time runs out.

To work around this we have partitioned the data by semester so that each load handles a smaller volume, but even so we sometimes still hit the timeout error because there is not enough time to load all the data.
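As a rough illustration of the semester partitioning described above, each partition can be its own dataflow entity whose query filters the date range at the source, so the filter folds down to Denodo and less data has to cross the wire within the 15-minute window. This is only a sketch: the connector call, table name (`Sales`), and date column (`OrderDate`) are placeholders, so adapt them to however your existing dataflow connects.

```
// Hypothetical sketch of one semester partition (first half of 2024).
// Connector call and names are assumptions; replace with your own.
let
    Source = Denodo.Contents("myDenodoServer", "myDatabase"),
    Sales = Source{[Name = "Sales"]}[Data],
    // Filter as early as possible so the predicate can fold to Denodo
    // and only six months of rows are requested per partition.
    FirstHalf2024 = Table.SelectRows(
        Sales,
        each [OrderDate] >= #date(2024, 1, 1) and [OrderDate] < #date(2024, 7, 1)
    )
in
    FirstHalf2024
```

If a semester is still too large to load inside the timeout, the same pattern can be narrowed further (quarterly or monthly partitions) without changing anything else in the query.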

How should we structure these dataflows so that we do not have these problems?
What is the best way to work with a large amount of data with Dataflows Gen1?

Clarification: we have to work with Gen1, since we are not allowed to use Gen2 yet.

Super User

Does it need to be a dataflow? Would a bunch of CSV or Parquet files work as well?


Also, this: Known issue - Visuals using the Denodo connector might show connection errors - Microsoft Fabric | M...
