Is anyone else unable to refresh dataflows? After publishing, I see the spinning circles next to the dataflow name, but they stop on their own and no refresh is recorded (no spinning circles next to the date). If I try to refresh from the workspace, nothing happens. For me, it started this morning (last night was fine). I am on US Central time, in the US North Central Fabric region.
Hi @ebjim, all,
We are seeing an increase in failures related to upcoming billing and throttling changes to how Fabric artifacts handle usage exceeding 24 hours of capacity. At the moment this applies to the US North and UK regions, and we are rolling back the changes that react to overused capacities as soon as possible to provide relief in the short term.
I recommend reading the following article and updating your Capacity Metrics app to better prepare for the eventual enablement of throttling.
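If you want to check capacity state programmatically in addition to the Metrics app, here is a minimal sketch using the Power BI REST API's capacities endpoint. Token acquisition is omitted, and the placeholder token is an assumption you must replace with a real Azure AD access token:

```python
# Minimal sketch: list capacities and their state via the Power BI REST API.
# Assumes you already have an Azure AD access token with permission to read
# capacities; token acquisition is intentionally omitted here.
import requests

ACCESS_TOKEN = "<your-aad-access-token>"  # placeholder, not a real token

resp = requests.get(
    "https://api.powerbi.com/v1.0/myorg/capacities",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

for cap in resp.json().get("value", []):
    # 'state' is typically 'Active' or 'Suspended'
    print(cap["displayName"], cap["sku"], cap["state"])
```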
We are also accelerating better handling and experiences to help Dataflow customers understand when capacities are overloaded.
Thank you,
Ben
Having the same issue: refreshing a Gen2 dataflow shows no change in refresh history. All pipelines using a Gen2 connection show "In progress" (only for Gen2) and never change, yet the underlying connection appears untouched. Everything else refreshes fine.
I even got a notification this morning that a Gen2 refresh failed, but with no reason given and no sign in the refresh history that the dataflow was actually triggered.
This started yesterday morning around 8am for us; flows were working right up to that point.
We have a ticket open with Microsoft but no resolution yet.
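One way to confirm whether a refresh was ever actually recorded, independent of the portal UI, is to query the dataflow's transaction history through the Power BI REST API. A minimal sketch, assuming you have an Azure AD token with dataflow read access; the workspace and dataflow IDs are placeholders, and note that this endpoint is documented for Gen1 dataflows, so Gen2 coverage is an assumption:

```python
# Minimal sketch: pull a dataflow's refresh (transaction) history via the
# Power BI REST API, to check whether a refresh was actually recorded.
# Token, workspace ID, and dataflow ID are placeholders.
import requests

ACCESS_TOKEN = "<your-aad-access-token>"
GROUP_ID = "<workspace-id>"
DATAFLOW_ID = "<dataflow-id>"

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/dataflows/{DATAFLOW_ID}/transactions"
)
resp = requests.get(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

for tx in resp.json().get("value", []):
    # Fields as documented for Gen1 dataflow transactions;
    # Gen2 behavior may differ.
    print(tx.get("startTime"), tx.get("status"), tx.get("refreshType"))
```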
As of 8am EST today, the issue resolved itself and we are getting successful refreshes again.
No changes were made from our side.
It seems this issue resolved itself. We were in contact with Microsoft, and they could not find any issues.
For future reference, as suggested by Microsoft: in these cases the next action would be to collect browser traces while recreating the issue for further investigation, which is no longer possible now that the issue has resolved itself. So if you see this kind of behavior in the future, please collect the traces as follows and provide them to support:
1. Click the ellipsis icon in the browser (the three horizontal/vertical dots), then go to More tools > Developer tools.
2. Open the Network tab and clear any pre-collected traces by clicking the Clear option.
3. On the same toolbar, enable "Preserve log" and disable the cache.
4. Retry refreshing the dataflow until it throws an error.
5. Export the logs by clicking Export HAR (the download icon).
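If you would rather capture the trace programmatically than click through DevTools, here is a minimal sketch using Playwright, which can write a HAR for everything a browser context records. The Fabric portal URL is the public address; the interactive sign-in and repro steps are left to you:

```python
# Minimal sketch: record a HAR with Playwright instead of exporting one
# manually from DevTools. Run headed so you can sign in and reproduce
# the failing dataflow refresh yourself.
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch(headless=False)
    # record_har_path tells Playwright to write a HAR file when the
    # context is closed, the equivalent of DevTools' "Export HAR".
    context = browser.new_context(record_har_path="fabric-refresh.har")
    page = context.new_page()
    page.goto("https://app.fabric.microsoft.com/")
    # Pauses here and opens the Playwright Inspector: sign in, retry the
    # refresh until it errors, then hit Resume to finish the capture.
    page.pause()
    context.close()  # flushes fabric-refresh.har to disk
    browser.close()
```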
@Anonymous Would the failures you're referring to include problems with Warehouses (e.g. the Warehouse UI failing to launch, or long-running queries terminating with no results) and Lakehouses (e.g. slow or missing synchronization of the SQL endpoint with the Lakehouse)?
Hi folks,
You can check the status of any service outages, degradation, or other issues on the status page (link below):
https://support.fabric.microsoft.com/support/
On that same page you should be able to find the links to raise a support ticket. I can't really tell what could be happening in your scenario, but the best way forward is to have an engineer take a closer look, determine the cause, and provide a fix.
After that, you should be able to refresh your dataflow.
I am having the same problem. I saw one other post on Reddit, but other than that I have not seen any acknowledgement that this is a widespread issue.
The Reddit post titled "Data Flow Gen 2 Stops refreshing without error" was also me. Shows you how desperate I am to get it working.
Same here: Dataflows Gen2 are not refreshing, whether they are triggered manually or from within a pipeline. This is also true across all our workspaces.
I hope the forum admins take note. If this is systemic, a support ticket would not suffice.