Hi.
I have several Dataflow Gen2 (not CI/CD) orchestrated in a pipeline. The dataflows that get data from a Fabric Lakehouse, process it, and write it to a Fabric Warehouse have stopped working since July 31st. The error message (shortened) is:
Error Code: DmsPbiServiceUserException, Error Details: Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: DataSource.Error: Failed to insert a table., InnerException: Unable to create a table on the Lakehouse SQL catalog due to metadata refresh failure
underlying error code: 'DmsPbiServiceUserException', error: [error=[code=DmsPbiServiceUserException,pbi.error=[code=DmsPbiServiceUserException,parameters=[ErrorMessage={"error":{"code":"DatamartsUserNotAuthorized","pbi.error":{"code":"DatamartsUserNotAuthorized","parameters":{"ErrorMessage":"User not authorized for datamart"},"details":[],"exceptionCulprit":1}}},HttpStatusCode=400],details={},exceptionCulprit=1]]], Underlying error: Unable to create a table on the Lakehouse SQL catalog due to metadata refresh failure
So without any changes to the dataflows, they suddenly stopped working after running without errors for a long time.
I have tried:
1) Opened the dataflow and just clicked "Publish" again.
2) Opened the dataflow, removed the data destinations, and re-added them.
3) Saved a template of the dataflow and created a new dataflow from the template.
4) Had a coworker take over the dataflow and try both publishing and recreating it.
5) Deleted the StagingWarehouseForDataflows and then published the dataflow.
Nothing helped, still same error.
Please advise.
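In case it helps anyone hitting the same DatamartsUserNotAuthorized error: one thing worth ruling out is whether the account running the dataflows still has its workspace role. A rough Python sketch, assuming the Fabric REST API's role-assignment listing and the azure-identity package; the workspace ID is a placeholder.

# Rough sketch (not an official troubleshooting step): list the workspace role
# assignments to rule out a lost workspace role for the account running the
# dataflows. Assumes the azure-identity and requests packages; the workspace ID
# is a placeholder.
import requests
from azure.identity import InteractiveBrowserCredential

WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"  # placeholder

# Interactive sign-in, then a user token scoped to the Fabric REST API.
token = InteractiveBrowserCredential().get_token(
    "https://api.fabric.microsoft.com/.default"
).token

resp = requests.get(
    f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}/roleAssignments",
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
resp.raise_for_status()

# The account the dataflows run under should still have at least Contributor here.
for assignment in resp.json().get("value", []):
    principal = assignment.get("principal", {})
    print(principal.get("displayName"), "->", assignment.get("role"))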
After contact with support, the issue is resolved, quote: "The issue stemmed from a combination of authorization and metadata refresh failures within the Fabric Lakehouse SQL catalog. Specifically, the error code DmsPbiServiceUserException indicated that the mashup engine was unable to refresh metadata and create tables due to a breakdown in schema resolution. This was further compounded by an inner exception stating that the user was “not authorized for datamart,” which pointed to a loss of access permissions—likely due to token expiration or backend policy updates.
Additionally, the mashup engine encountered privacy isolation conflicts when attempting to merge data from multiple sources. These conflicts were triggered by the default privacy settings, which prevented cross-source mashups and led to failures in sending data to the destination.
To resolve the issue, we implemented several corrective actions. We created a new dataflow with CI/CD support, and updating the credentials restored proper authentication across data sources. Activating the “Merge data sources” option in the privacy settings allowed the mashup engine to perform cross-source operations without isolation errors."
So in short: I clicked the dataflow's ... menu, chose "Save as CI/CD", made sure the connections were logged in with my user, and that the Options -> Privacy -> Merge checkbox was checked.
The previous limitation where a Dataflow Gen2 with CI/CD could not be used in a pipeline is gone, so a Dataflow Gen2 with CI/CD can now be used in a pipeline Dataflow activity, which let me use the new CI/CD versions in the scheduling pipeline.
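In case it helps others wiring the CI/CD dataflows into a pipeline: the scheduling pipeline can also be kicked off on demand through the Fabric job scheduler REST API. A minimal sketch, assuming the on-demand job endpoint with jobType=Pipeline; the IDs are placeholders.

# Minimal sketch: run a data pipeline on demand via the Fabric job scheduler API.
# Assumes the "Run On Demand Item Job" endpoint with jobType=Pipeline; workspace
# and pipeline item IDs are placeholders.
import requests
from azure.identity import InteractiveBrowserCredential

WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"  # placeholder
PIPELINE_ID = "11111111-1111-1111-1111-111111111111"   # placeholder (pipeline item id)

token = InteractiveBrowserCredential().get_token(
    "https://api.fabric.microsoft.com/.default"
).token

resp = requests.post(
    f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}"
    f"/items/{PIPELINE_ID}/jobs/instances?jobType=Pipeline",
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
# 202 Accepted means the run was queued; the Location header points at the job
# instance that can be polled for status.
print(resp.status_code, resp.headers.get("Location"))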
Hi @nioj2S ,
Thanks for confirming the resolution. Glad to hear it’s working now, and thanks to the support team for helping address the issue.
This will be helpful for others facing similar errors, especially now that Dataflow Gen2 with CI/CD can be used in a pipeline activity without the previous limitations.
Please continue using Fabric Community for further queries.
Regards,
Vinay kumar.
Hi @nioj2S ,
Thanks for reaching out to Microsoft Fabric Community.
Based on the error details, the failure appears to be related to a permissions or authorization issue when Dataflow Gen2 attempts to write to the Warehouse.
Just checking in to see if this has now been resolved, as it might have been a temporary glitch.
Have you had a chance to raise a support ticket with Microsoft and get a resolution? If so, please share the outcome here for the benefit of the wider community.
Thanks @miguel for addressing this earlier.
Please reach out for further assistance.
Thank you.
Hi! Still experiencing the issue, have an open ticket with MS Support, so no resolution yet. I did try to troubleshoot some more myself and noticed that when I create a dataflow that gets data from the lakehouse and puts it into the warehouse it works, but when I create a dataflow that gets data from the warehouse and puts it into another table in the warehouse it fails. (I have provided this info to support as well.)
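For reference, a rough way to isolate this outside the dataflow is to try the same warehouse-to-warehouse operation directly against the Warehouse SQL endpoint, for example with pyodbc. This is only a sketch: server, database, and table names are placeholders, and it assumes ODBC Driver 18 with interactive Entra ID sign-in.

# Rough sanity check (placeholders throughout): can this account read a warehouse
# table and create a new table in the same warehouse via the SQL endpoint? This
# roughly mimics what the dataflow's data destination step does.
import pyodbc

conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<warehouse-sql-endpoint>.datawarehouse.fabric.microsoft.com;"
    "Database=<warehouse-name>;"
    "Authentication=ActiveDirectoryInteractive;"
    "Encrypt=yes;"
)

with pyodbc.connect(conn_str, autocommit=True) as conn:
    cur = conn.cursor()
    # Read from an existing table...
    cur.execute("SELECT TOP 10 * FROM dbo.SourceTable")
    print(len(cur.fetchall()), "rows read")
    # ...then try to create and drop a throwaway table, which is the step the
    # dataflow error ("Failed to insert a table") points at.
    cur.execute("CREATE TABLE dbo._permission_probe (id INT)")
    cur.execute("DROP TABLE dbo._permission_probe")
    print("create/drop succeeded")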
Hi @nioj2S ,
Thanks for the update. The behavior you described, where writing from Lakehouse to Warehouse works but Warehouse-to-Warehouse fails, aligns with the DatamartsUserNotAuthorized error seen earlier. It’s possible the operation is being blocked when the target and source are both within the Warehouse SQL endpoint.
There’s a similar discussion here where the issue was linked to staging artifacts being deleted. Staging lakehouses and other staging artifacts act as system-managed components and should not be removed, as this can lead to failures that may require recreating the workspace to restore functionality.
Since you’ve already provided your scenario to support, they should be able to confirm if this is related. Hope the support team is able to get it resolved soon. Please share the outcome once they get back to you, so others facing similar scenarios can benefit.
Thank you.
No changes were made in our workspace; the dataflows just suddenly stopped working. We tried deleting the visible staging warehouses after several days of failed dataflows, but it did nothing. But then again, if a staging warehouse (or lakehouse) is visible in the root folder of the workspace, as far as I know, that means it is not connected to a dataflow. The staging warehouse/lakehouse that dataflows actually use is usually hidden so that it is not accidentally deleted. If one wants to delete a staging warehouse, one has to either disconnect all the dataflows using it or delete those dataflows, and then the staging warehouse stops being hidden and becomes visible in the root folder.
So, chances are that we just deleted an unused staging warehouse. No idea why those keep popping up though, but they have names like "StagingWarehouseForDataflows_20250811999999".
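For reference, one way to see which of these show up as ordinary workspace items is to list the workspace items through the Fabric REST API. A rough sketch; the workspace ID is a placeholder, and the hidden, system-managed staging items used by active dataflows may not be returned by this call.

# Rough sketch: list workspace items and flag anything that looks like a leftover
# staging warehouse. The workspace ID is a placeholder; hidden, system-managed
# staging items attached to active dataflows may not appear in this listing.
import requests
from azure.identity import InteractiveBrowserCredential

WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"  # placeholder

token = InteractiveBrowserCredential().get_token(
    "https://api.fabric.microsoft.com/.default"
).token

resp = requests.get(
    f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}/items",
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
resp.raise_for_status()

for item in resp.json().get("value", []):
    if item.get("displayName", "").startswith("StagingWarehouseForDataflows"):
        print(item["displayName"], item.get("type"))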
Hi @nioj2S ,
Thanks for the details. That fits the expected behavior - a staging warehouse or lakehouse visible in the workspace root is usually not attached to any active dataflow, so deleting visible but unused staging artifacts will not affect running dataflows.
But for an active dataflow, the staging warehouse is still visible in the OneLake data catalog, where it can be deleted - doing so may cause issues.
For details: staging artifacts
Since you’ve already provided your scenario to support, they should be able to confirm if this is related.
Hope the support team is able to get it resolved soon.
Could you please raise a support ticket so an engineer can take a closer look at your scenario and help troubleshoot the situation?
Below is the link to raise the support ticket:
http://support.fabric.microsoft.com/support
If anyone else is experiencing this issue, please feel free to comment on this thread and also raise a support ticket.
I am also encountering the same issue.