Dataflow Refresh Failures

Dataflow refresh is quite unpredictable.


Today I could refresh three dataflows in one workspace without an error (although one dataflow needed a retry). But I could not refresh three other dataflows with exactly the same entities and the same queries in another workspace. Unfortunately, the latter workspace is our production environment, so we are in trouble right now.


Whenever I saved changes after an edit, even if I did not change anything, the dataflow might or might not refresh. Sometimes it fails with an 'Internal Error'; sometimes it claims the 2-hour time limit was exceeded after only 10 minutes. Sometimes editing and saving without changes fixes the issue, sometimes refreshing under another account does. But now none of these actions changes the behaviour of the dataflows; they will not refresh anymore. Even worse, from our point of view there does not seem to be any logic behind the occurrence or absence of refresh failures.
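Since a plain retry occasionally succeeds, automating the retry at least reduces the manual babysitting. Below is a minimal sketch of a generic retry-with-backoff wrapper. It assumes you trigger refreshes programmatically through some `trigger_refresh` callable of your own (that callable is hypothetical — adapt it to however you kick off a dataflow refresh, e.g. via the Power BI REST API or your scheduling tool):

```python
import random
import time

def refresh_with_retry(trigger_refresh, max_attempts=3, base_delay=30):
    """Call trigger_refresh() until it succeeds or attempts run out.

    trigger_refresh is any zero-argument callable that raises an
    exception on failure (hypothetical here; wrap your own refresh
    trigger). Waits with exponential backoff plus jitter between
    attempts, and re-raises the last exception if all attempts fail.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return trigger_refresh()
        except Exception as exc:
            if attempt == max_attempts:
                raise
            # Exponential backoff: base_delay, 2*base_delay, 4*base_delay, ...
            # with random jitter so parallel retries do not align.
            delay = base_delay * 2 ** (attempt - 1) + random.uniform(0, base_delay)
            print(f"Attempt {attempt} failed ({exc}); retrying in {delay:.0f}s")
            time.sleep(delay)
```

This does not fix the underlying service problem, of course, but it makes the "sometimes a second attempt works" behaviour less painful overnight.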


We switched to dataflows because they are now out of the preview phase and seemed a good solution. We develop our queries in Power BI Desktop to minimise the number of changes to the actual dataflows. It is therefore still possible to replace all dataflows with the queries from this file, using the original data sources, but that would require a lot of rework on all the datasets behind the (already published) reports.


Has anyone had similar experiences with dataflows? Does anybody know a remedy for this erratic behaviour?

Status: New
Advocate II

Hi, I'm having the same issues and have submitted a support ticket.

I even tried creating a new dataflow but that is refusing to refresh as well.



Regular Visitor


We are in much the same situation. In November 2019 we switched several reports to dataflows, as it was a clean and efficient way to share the results of queries and to avoid maintaining copies of queries in different .pbix files.

It would be difficult to go back to the original solution.

Maybe the issue is linked to the following problem, which occurred on 5 February 2020

Hope for a quick fix !



Advocate II

Thanks for the comments.


Each night the data is copied from different sources into a single Azure SQL Database with Azure Data Factory pipelines. This part works flawlessly. All dataflows read from this staging database, combining data from several source tables into logical entities.


The dataflow refresh issues remain the same if I try to refresh each dataflow separately. This excludes deadlocks as a cause for the refresh failures. Fetching the data from an Azure SQL Database also excludes the gateway as a cause. Even dataflows that contain only static data without any connection to a data source exhibit the issues mentioned above.

Frequent Visitor



I have been having the same issue since this morning. None of my dataflows can be refreshed, although until now they had been working well.



Not applicable


Same issue here. Even tiny dataflows hit this "exceeded 2 hour refresh limit" error.

We use dataflows for 10+ reports in the organization and now none of them refresh.


Please address the issue.



Advocate II

same for me.

Community Support

Hi all, 


I tested on my side but was not able to reproduce the issue. Please re-specify valid credentials for the data source and then test again. If the issue still occurs, please create a support ticket to get support.


(Attachment: Support Ticket.gif)


Best Regards,
Qiuyun Yu

Not applicable

Same issue here. BigQuery is my data source. Refreshes fail after 3 minutes with a message saying the 2-hour window has been exceeded. Will post a support ticket.



Advocate I

Same here, errors since yesterday. Please fix!

Frequent Visitor

Same here. We switched to dataflows last August to ingest CSV files from SharePoint, replacing a massive PBIX with all queries inside, which took hours to refresh (and often failed).

When dataflows work, they are much faster than our old solution and usually refresh in 10-20 seconds. But we also had to switch to another Premium capacity in November, as our previous capacity was heavily overloaded; the dataflows were very unreliable there and refreshes failed extremely often (>50%).

On the new capacity the dataflow refresh failure rate is about 10-15%, while dataset refreshes are doing very well (>98% success). What makes it worse is that there is no useful error message in the refresh history to indicate why dataflows fail. Every day something else causes trouble, even if we have not touched any of the query code. A very unreliable user experience.

Saving dataflows after editing is also a pain, as there are all kinds of error messages when trying to save a dataflow. Again, none of the save errors helps indicate what needs to be done to save successfully.


So pathetic!

Definitely room for improvement for an otherwise compelling tool set.