Anonymous
Not applicable

Dataflow Timeout

Hi all,

 

Does anyone know a way to stop the dataflow from timing out?

 

The data source is an on-premises SQL database, queried with a native query. The result set is in the millions of rows.

 

Thanks

 

Mike

 

 

1 ACCEPTED SOLUTION
Galia
Microsoft Employee

Hi Mike,

 

Please send the screenshot of the failure you got to our support email so we can investigate further.

 

Thanks,

Galia


17 REPLIES
truthisavirus
New Member

The default timeout on the SQL connector is 10 minutes, but there is a parameter in the connector's configuration where you can specify a longer duration. Here's an example setting the command timeout to 30 minutes:

Sql.Database("server", "database", [CommandTimeout = #duration(0, 0, 30, 0)])
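
Building on that tip, here is a minimal sketch (purely illustrative; the server, database, and SQL text are placeholders, not anything from this thread) of how the same CommandTimeout option can sit alongside a native query like the one Mike describes:

// Hypothetical example only: replace the server, database, and query text with your own.
let
    Source = Sql.Database(
        "MyServer",
        "MyDatabase",
        [
            Query = "SELECT * FROM dbo.LargeFactTable",  // native query pushed to SQL Server
            CommandTimeout = #duration(0, 1, 0, 0)       // allow up to 1 hour instead of the default
        ]
    )
in
    Source

Note that when you use the Query option, later transformation steps generally don't fold back to the server, so any heavy filtering is best done in the SQL text itself.
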
rdnguyen
Helper V

Not sure if anyone is still concerned about this, but you can split the whole table in SQL into multiple segments and load the older data overnight, one segment at a time. Then, in the transformation phase, you can merge them all back together using the Power BI service's compute. That should work for bringing in a huge amount of data like you describe; a sketch of the idea is below.
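
As a purely illustrative sketch of that idea (all server, table, and column names below are hypothetical), each segment filters the source table by date range and a final step merges them. In a real dataflow, each segment would typically be its own entity refreshed on its own schedule, with a separate entity referencing and combining them:

// Hypothetical example only: substitute your own server, database, table, and date column.
let
    Source = Sql.Database("MyServer", "MyDatabase"),
    FactSales = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
    // One segment per year so each individual load stays small enough to finish
    Segment2022 = Table.SelectRows(FactSales, each [OrderDate] >= #date(2022, 1, 1) and [OrderDate] < #date(2023, 1, 1)),
    Segment2023 = Table.SelectRows(FactSales, each [OrderDate] >= #date(2023, 1, 1) and [OrderDate] < #date(2024, 1, 1)),
    // Merge the segments back into a single table
    Merged = Table.Combine({Segment2022, Segment2023})
in
    Merged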

Anonymous
Not applicable

I have the same issue; I have been searching for a solution for the past week.

When I try to refresh a small dataflow it always succeeds, but when the size gets bigger it fails after exactly 4 or 6 minutes, which makes me think it is a timeout setting.

 

I don't know where to address this issue and get an actual answer.

 

Same issue here... if I paste M code from the advanced editor, sometimes it works when I save & close, and sometimes it still fails saying I have a query error. The error is that the query timed out.

InsightBob
Frequent Visitor

Can anyone share the solution to this? I'm not sure the marked solution is actually a solution.

I have no solution for the problem, but did use the following workaround. 

 

I created the desired query in Power BI Desktop. Then I captured the M code from the advanced editor and pasted it into the editor in Dataflows. Then I just clicked 'Close and Save' and navigated away without waiting for the preview to finish.

 

The dataflow loaded without a problem, but the query has to be managed in Power BI Desktop's Power Query Editor and then copied into the editor in the Dataflows interface.

 

Not a very satisfying approach, but it is functional.

Thanks for the reply. I take that approach with every entity I build: create it in Desktop, then lift and shift the query from the advanced editor.

Interestingly, I tried to save my entity again after hours and it worked a treat, so I'm guessing it's some kind of memory/capacity thing?

I work for a global company and I'm based in the UK. I was trying to get this done at 2pm today, just as our friends across the pond come online. I had success at 6pm.
jjudy
Helper I

What is the solution to this? 

v-yuta-msft
Community Support

@Anonymous,

 

Please refer to below:

 

There are a few known limitations to using Enterprise Gateways and dataflows:

  • Each dataflow may use only one gateway. As such, all queries should be configured using the same gateway.
  • Changing the gateway impacts the entire dataflow.
  • If several gateways are needed, the best practice is to build several dataflows (one for each gateway) and use the compute or entity reference capabilities to unify the data.
  • Dataflows are only supported using enterprise gateways. Personal gateways will not be available for selection in the drop down lists and settings screens.

Community Support Team _ Jimmy Tao

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

Anonymous
Not applicable

Hi,

 

We are using an enterprise gateway and just have one gateway for all dataflows.

 

Regards

 

Mike

Galia
Microsoft Employee

Hi Mike,

 

Can you please share the refresh log (by downloading the CSV file from the refresh history)?

Please mail it to the dataflow support DL: pbidf@microsoft.com

 

Thanks,

Galia

Anonymous
Not applicable

Hi Galia

 

I don't get to the refresh stage, as the dataflow times out while I am previewing the data when building the dataflow.

 

I am unable to save the dataflow.

 

Regards

 

Mike

Galia
Microsoft Employee

Hi Mike,

 

Please send the screenshot of the failure you got to our support email so we can investigate further.

 

Thanks,

Galia

I am also having a problem like this. Do we know any more about timeout control?

 

I am trying to retrieve a portion of a large table, about 10 million of 88 million rows. If I load that data into its own SQL table, the Dataflows preview loads instantly. But if I try to load from the actual large table, Dataflows works at it for about 10 minutes and quits.

 

I wouldn't expect the query to complete even in SSMS in that time. How can we get the Dataflows preview to be a little more patient?
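
For what it's worth, here is a hypothetical sketch (placeholder names throughout, not from these posts) that combines the CommandTimeout tip from earlier in the thread with a row filter that should fold back to SQL Server as a WHERE clause, so only the needed slice of the large table is requested:

// Hypothetical example only: replace the server, database, table, and column names.
let
    Source = Sql.Database(
        "MyServer",
        "MyDatabase",
        [CommandTimeout = #duration(0, 1, 0, 0)]  // allow the command up to 1 hour
    ),
    BigTable = Source{[Schema = "dbo", Item = "BigFactTable"]}[Data],
    // This filter should fold to the source so SQL Server returns only the recent rows
    Recent = Table.SelectRows(BigTable, each [LoadDate] >= #date(2019, 1, 1))
in
    Recent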

Anonymous
Not applicable

@RBunting How are you managing to download such a huge dataset to your Desktop? I still don't understand why Power BI Desktop needs to download the entire Dataflow into the desktop application. I am talking about tens of GBs of data. It makes no sense.

[Attachment: Power BI -Dataflow.PNG]
[Attachment: Power BI -Dataflow-error.PNG]
@v-yuta-msft 
I tried to highlight some of the issues with Dataflow in my post here, but there is absolutely no response! My whole org was counting on using Dataflow for enterprise-level ETL.
https://community.powerbi.com/t5/Service/Dataflow-Dataset-NEED-more-Clarity/td-p/735330

@Anonymous I think there's been some misunderstanding. I am loading about 10 million rows, but that's nothing like tens of gigs.

 

The problem in my case is that it simply takes time for SQL Server to pull those records out of the large table. If they are in a table by themselves, they load easily. It's not a matter of size; I just wish the Dataflow's timeout when loading the preview were more generous, or could be managed manually.

 

To avoid this problem, I build the M code in Desktop since it's an easier environment to work in. But I don't load any of that data into Desktop's model; I only want the code out of Power Query. Then I paste that into the Dataflow's advanced editor.

 

At this point, waiting for the preview would create a timeout error, so I don't wait. I just save and close, confident from my preview in Desktop that I'm getting the data I need. Naturally, I'll do some QA on the dataflow once it's properly loaded.

 

Anonymous
Not applicable

Got it. So you are still in the first stage of frustration.
Once the dataflow is loaded, you will experience what I am going through: having to load the complete data into Desktop. In my case that's 54M rows! 🙂
