A very simple Gen2 Dataflow just went into the crapper, so I decided to recreate it, which didn't take very long. At least the newly created Dataflow can be published, and after about a minute (which really should be sub-second, since it's loading three config tables of < 10 rows each from files) it finally processes. No problem. Now just delete the old dataflow and rename the recreated one. Except it won't let me rename the newly created dataflow. And no, it has nothing to do with renaming it to an existing or previously existing name. I tried to rename it to PBIDataflowsSuck and it wouldn't accept that either.
One thing I'll say is that I typically have zero luck renaming a dataflow when first creating it. For instance, I create a new dataflow and, while in it, click on the name in the top left corner. When I try to change it, about 2 minutes later I get an error and the dataflow name reverts to "Dataflow 1" or whatever.
I've taken to creating a new dataflow, dropping a blank query in it, publishing, then using the Settings dialog to change the name. This seems to work. Then I go back into the dataflow, delete the blank query, and do the real work.
Agreed, this is very messy and takes a long time (it often takes 1+ minutes to publish a dataflow that literally only has a blank query in it). But it always works.
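If you'd rather script the rename than click through the Settings dialog, something like the Fabric REST API's Update Item call should do it. This is a rough, untested sketch; the GUIDs, the new name, and the token handling are all placeholders you'd fill in yourself:

```python
import os
import requests

WORKSPACE_ID = "<workspace-guid>"   # placeholder
ITEM_ID = "<dataflow-item-guid>"    # placeholder
TOKEN = os.environ["FABRIC_TOKEN"]  # assumes a pre-acquired AAD bearer token

# Fabric REST API "Update Item": PATCHing displayName renames the artifact.
url = f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}/items/{ITEM_ID}"
resp = requests.patch(
    url,
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"displayName": "ConfigTablesLoad"},  # hypothetical new name
)
resp.raise_for_status()  # raises if the service rejected the rename
```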
Hope this helps,
Scott
It's bad. It's really, really bad. It just started happening yesterday, and thank god we don't use this feature extensively in our architecture, because frankly, the motto in our group is "Dataflows suck." It's not just running a dataflow: to get a dataflow to execute successfully, we have to create a new dataflow, copy all queries to the new dataflow, and then publish. It runs successfully... once. If you try to refresh that dataflow again without changing a thing, it fails almost immediately. Opening up the newly created dataflow and trying to publish it... fails. The only thing one can do is create a new dataflow, copy everything from the previous dataflow, and save the new dataflow... which will execute successfully... once. And only once. Can't publish. Can't rename. Can't refresh. Can't take ownership. Somebody broke something badly... at least in our tenant.
Here's what it looks like where I just created Dataflow-6. Dataflow 3 and about four other dataflows are identical: they ran successfully on initial creation and Publish, then failed as soon as I tried to kick off another refresh right after the initial one.
Refresh history on Dataflow-6 looks like the following:
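For anyone who wants to pull that refresh history programmatically instead of eyeballing the UI, here's a rough sketch using the Power BI REST API's dataflow transactions endpoint. Untested; whether Gen2 dataflows surface in this endpoint is an assumption on my part, and the GUIDs and token handling are placeholders:

```python
import os
import requests

GROUP_ID = "<workspace-guid>"      # placeholder
DATAFLOW_ID = "<dataflow-guid>"    # placeholder
TOKEN = os.environ["PBI_TOKEN"]    # assumes a pre-acquired AAD bearer token

# Power BI REST API "Get Dataflow Transactions" lists refresh attempts.
url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/dataflows/{DATAFLOW_ID}/transactions"
)
resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()

# Each transaction carries a status and timestamps; print the recent ones.
for tx in resp.json().get("value", []):
    print(tx.get("startTime"), tx.get("status"))
```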
Hi @MartinMason ,
Apologies for the issue you have been facing.
I would request you to go ahead with Microsoft support for this. Please raise a support ticket at this link: https://support.fabric.microsoft.com/en-US/support/.
Also, once you have opened the support ticket, please share the support case # here.
Thanks.
Hi @MartinMason ,
Following up to check whether you've had a chance to create a support ticket.
If you have, please share the support case # here.
The case # is 2311030040012138. Looks like whatever the issue was has been resolved; Gen 2 Dataflows we create now don't have issues. Some of the dataflows that were created a couple of weeks ago, though, are now completely toast: ownership cannot be switched, and the owner can't even open the dataflow. Oh, the joys of pre-release software.
One of the things we noticed when dataflows miraculously started working again was the appearance of a DataFlowsStagingLakehouse that no one on my team had seen before. It doesn't show up in the workspace but is listed as an option when Lakehouse is selected as a data destination.
Hey! Please do continue working with the support team on the issues that you've mentioned.
We recently made a change to hide the staging Lakehouse from the workspace list. It was shown before, but based on the feedback we received it was causing confusion, so it's being hidden from the workspace list and will be hidden from other experiences in the future. This lakehouse is a system artifact used for evaluations needed by the Dataflow Gen2 engine.
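If you're curious which artifacts a workspace actually exposes, you can enumerate them with the Fabric REST API's List Items call. A rough, untested sketch; the workspace GUID and token are placeholders, and whether the staging lakehouse shows up in the API response at all is an assumption:

```python
import os
import requests

WORKSPACE_ID = "<workspace-guid>"   # placeholder
TOKEN = os.environ["FABRIC_TOKEN"]  # assumes a pre-acquired bearer token

# Fabric REST API "List Items": enumerates every artifact in the workspace.
url = f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}/items"
resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()

# Print each artifact's type and name to compare against the workspace UI.
for item in resp.json().get("value", []):
    print(item["type"], item["displayName"])
```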
Oops. A dataflow that was working now has all its Lakehouse destinations removed. What's going on today? Clearly something is being jacked with on the Power BI service. Other users on my team also cannot take ownership of dataflows to modify or correct them. Please fix ASAP or let us know what the deal is.
Hi @MartinMason ,
Thanks for using Fabric Community and reporting this.
Apologies for the issue you have been facing. I would like to check: are you still facing this issue?
It's difficult to tell what could be causing this behavior. I would request you to wait for some time and try again.
If the issue still persists, can you please share a few more details about the nature of the failure, such as screenshots, the error message, the session ID, and repro steps?
I would be able to guide you better once you provide these details.
The workaround is to create a new dataflow, rename it, then Publish Later. That works. If you rename and Publish Now, it does a Dikembe Mutombo and blocks that shot.
Thanks Martin, you've confirmed my suspicions. We've been seeing the same type of behaviour in our tenant, and it does all seem to stem from the renaming process for dataflows. It's as if the dataflow name itself is being used to reference it behind the scenes, instead of an ID, so when you rename it you need to wait several minutes for everything to sync up again. And if you open up the flow before that syncing is done, all hell breaks loose and the flow is, as you say, toast.
Publish Later does seem to bypass the issue, if you wait long enough. Like you, I suspect the staging entity behind the scenes is part of the problem.
Very sloppy.
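If you'd rather script the wait than guess at it, one option is to poll the item until the new name actually comes back from the service before reopening the flow. A rough, untested sketch assuming the Fabric REST API's Get Item endpoint; the GUIDs, name, and token are placeholders:

```python
import os
import time
import requests

WORKSPACE_ID = "<workspace-guid>"   # placeholder
ITEM_ID = "<dataflow-item-guid>"    # placeholder
NEW_NAME = "MyRenamedDataflow"      # hypothetical: the name you just set
TOKEN = os.environ["FABRIC_TOKEN"]  # assumes a pre-acquired bearer token

url = f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}/items/{ITEM_ID}"
headers = {"Authorization": f"Bearer {TOKEN}"}

# Poll every 10 seconds, up to ~10 minutes, until the rename is reflected.
for attempt in range(60):
    item = requests.get(url, headers=headers).json()
    if item.get("displayName") == NEW_NAME:
        print(f"Name synced after ~{attempt * 10}s; safe to reopen")
        break
    time.sleep(10)
else:
    print("Name never synced -- hold off on opening the dataflow")
```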