We are currently facing a persistent issue when using Dataflow Gen2 to copy data from an on-premises SQL Server table into a pre-created table within a Microsoft Fabric Data Warehouse (Lakehouse).
The dataflow is created successfully; all steps, including the source connection and destination mapping, complete without issue.
Publishing the dataflow is successful.
However, when we trigger the refresh, it fails with the following error:
Error details.
Refresh error -PickList: Error Code: Mashup Exception Data Source Error, Error Details: Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: Failed to insert a table., InnerException: Unable to create a table on the Lakehouse SQL catalog due to metadata refresh failure, for Lakehouse Id: cfb7a866-1f0c-4306-8081-4c2caf02d6b0 and Batch Id: cfb7a866-1f0c-4306-8081-4c2caf02d6b0@c89a0539-465c-4d72-8a34-d40846b48769$2025-04-23T15:02:29.0297905Z@69c034cb-c2cf-483c-adda-60685ebdfa5f. underlying error code: 'DmsPbiServiceUserException', error: [error=[code=DmsPbiServiceUserException,pbi.error=[code=DmsPbiServiceUserException,parameters=[ErrorMessage={"error":{"code":"DatamartsUserNotAuthorized","pbi.error":{"code":"DatamartsUserNotAuthorized","parameters":{"ErrorMessage":"User not authorized for datamart"},"details":[],"exceptionCulprit":1}}},HttpStatusCode=400],details={},exceptionCulprit=1]]], Underlying error: Unable to create a table on the Lakehouse SQL catalog due to metadata refresh failure, for Lakehouse Id: cfb7a866-1f0c-4306-8081-4c2caf02d6b0 and Batch Id: cfb7a866-1f0c-4306-8081-4c2caf02d6b0@c89a0539-465c-4d72-8a34-d40846b48769$2025-04-23T15:02:29.0297905Z@69c034cb-c2cf-483c-adda-60685ebdfa5f. underlying error code: 'DmsPbiServiceUserException', error: [error=[code=DmsPbiServiceUserException,pbi.error=[code=DmsPbiServiceUserException,parameters=[ErrorMessage={"error":{"code":"DatamartsUserNotAuthorized","pbi.error":{"code":"DatamartsUserNotAuthorized","parameters":{"ErrorMessage":"User not authorized for datamart"},"details":[],"exceptionCulprit":1}}},HttpStatusCode=400],details={},exceptionCulprit=1]]] Details: Reason = DataSource.Error;ErrorCode = Lakehouse036;Message = Unable to create a table on the Lakehouse SQL catalog due to metadata refresh failure, for Lakehouse Id: cfb7a866-1f0c-4306-8081-4c2caf02d6b0 and Batch Id: cfb7a866-1f0c-4306-8081-4c2caf02d6b0@c89a0539-465c-4d72-8a34-d40846b48769$2025-04-23T15:02:29.0297905Z@69c034cb-c2cf-483c-adda-60685ebdfa5f. underlying error code: 'DmsPbiServiceUserException', error: [error=[code=DmsPbiServiceUserException,pbi.error=[code=DmsPbiServiceUserException,parameters=[ErrorMessage={"error":{"code":"DatamartsUserNotAuthorized","pbi.error":{"code":"DatamartsUserNotAuthorized","parameters":{"ErrorMessage":"User not authorized for datamart"},"details":[],"exceptionCulprit":1}}},HttpStatusCode=400],details={},exceptionCulprit=1]]];Detail = [error = [...]];Message.Format = Unable to create a table on the Lakehouse SQL catalog due to metadata refresh failure, for Lakehouse Id: #{0} and Batch Id: #{1}. underlying error code: '#{2}', error: #{3};Message.Parameters = {"cfb7a866-1f0c-4306-8081-4c2caf02d6b0", "cfb7a866-1f0c-4306-8081-4c2caf02d6b0@c89a0539-465c-4d72-8a34-d40846b48769$2025-04-23T15:02:29.0297905Z@69c034cb-c2cf-483c-adda-60685ebdfa5f", "DmsPbiServiceUserException", "[error=[code=DmsPbiServiceUserException,pbi.error=[code=DmsPbiServiceUserException,parameters=[ErrorMessage={""error"":{""code"":""DatamartsUserNotAuthorized"",""pbi.error"":{""code"":""DatamartsUserNotAuthorized"",""parameters"":{""ErrorMessage"":""User not authorized for datamart""},""details"":[],""exceptionCulprit"":1}}},HttpStatusCode=400],details={},exceptionCulprit=1]]]"};ErrorCode = Lakehouse045;Microsoft.Data.Mashup.Error.Context = System GatewayObjectId: 3edba8b7-2170-4fd2-8d53-0f6856adba8f (Request ID: f63e6ae1-5209-45b5-a793-1cc9a4e84bd3).
Gateway version: April 2025 (Version: 3000.266.4) – freshly updated
Source: On-premises SQL Server (connectivity confirmed, SELECT privileges granted)
Target: Pre-created Fabric Data Warehouse table
User Role: Admin privileges on Fabric workspace
Authentication: Correctly configured in both source and destination
Data Gateway: Online and mapped correctly to the data source
Verified and tested on-premises SQL Server connection through the gateway
Ensured that all relevant permissions are granted (read access to SQL source, write access to DW destination)
Confirmed that the user initiating the refresh has Admin privileges within Fabric
Attempted refreshing after revalidating credentials and re-mapping the destination
Despite fulfilling all the prerequisites and configuration steps, the refresh consistently fails with a "DatamartsUserNotAuthorized" error. We are not using Datamarts; the target is a Fabric Data Warehouse (Lakehouse).
We would appreciate any suggestions, insights, or known workarounds that could help resolve this issue.
Thank you in advance for your support!
Hi @VenkateshV , Thank you for reaching out to the Microsoft Community Forum.
This is likely due to either missing permissions on the Lakehouse's SQL Analytics Endpoint or a metadata sync issue. First, check your SQL permissions. Even as a workspace admin, you might not have SQL-level rights. Use SSMS to connect to the Lakehouse's SQL Endpoint and confirm you have CONTROL, ALTER, or INSERT permissions on the target table. If not, have your Fabric admin grant them through SSMS or the portal. Next, force a metadata sync by querying the table in SSMS. This activates the endpoint and resolves many sync-related refresh issues.
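If you'd rather script those checks than click through SSMS, here is a minimal pyodbc sketch that runs the same T-SQL. The endpoint address, database, table, and user principal below are placeholders for your environment, and it assumes pyodbc plus ODBC Driver 18 for SQL Server are installed:

```python
# Sketch: verify SQL-endpoint permissions and trigger a metadata sync.
# Placeholders: server, database, and table names must be replaced with yours.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=your-endpoint.datawarehouse.fabric.microsoft.com;"  # SQL analytics endpoint (placeholder)
    "Database=YourLakehouse;"                                   # placeholder
    "Authentication=ActiveDirectoryInteractive;"                # Entra ID sign-in
    "Encrypt=yes;"
)
cursor = conn.cursor()

# 1. List the effective permissions the current login holds on the target table.
cursor.execute(
    "SELECT permission_name FROM fn_my_permissions('dbo.YourTable', 'OBJECT')"
)
print([row.permission_name for row in cursor.fetchall()])  # expect INSERT/ALTER/CONTROL

# 2. A lightweight query against the table forces the endpoint to refresh its
#    metadata, which clears many sync-related refresh failures.
cursor.execute("SELECT TOP 1 * FROM dbo.YourTable")
cursor.fetchall()

# 3. If INSERT is missing, a Fabric admin can grant it (run while connected as the admin):
# cursor.execute("GRANT INSERT ON OBJECT::dbo.YourTable TO [user@contoso.com]")
conn.close()
```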
Also, review your Dataflow setup. Make sure the destination is set to “Lakehouse,” not “Datamart,” and that the schema exactly matches the pre-created table. Staging must be enabled under Query Settings -> Staging. If issues persist, try writing to a new table created directly by the Dataflow. If that doesn't help, try deleting and recreating the Lakehouse connection under Manage Connections and Gateways, then reconfigure your Dataflow and republish.
As a workaround, use a Fabric Notebook with Spark to load the data directly into the Lakehouse. It’s a reliable fallback if Dataflow continues to fail.
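For the notebook route, a minimal PySpark sketch is below. The server, database, table, and credential values are placeholders, and it assumes the Spark cluster can actually reach your SQL Server over the network; on-premises sources often need additional connectivity beyond the gateway, which is a common reason a notebook attempt fails:

```python
# Sketch: load the on-prem SQL Server table over JDBC and write it to the
# Lakehouse as a Delta table. `spark` is predefined in a Fabric notebook.
df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://your-sql-server:1433;"
                   "databaseName=YourDb;encrypt=true;trustServerCertificate=true")
    .option("dbtable", "dbo.SourceTable")   # placeholder source table
    .option("user", "your_sql_login")       # placeholder credentials
    .option("password", "your_password")
    .load()
)

# Write to the Lakehouse attached to the notebook. "overwrite" replaces the
# pre-created table's contents; use "append" to add rows instead.
df.write.format("delta").mode("overwrite").saveAsTable("TargetTable")
```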
If this helped solve the issue, please consider marking it 'Accept as Solution' so others with similar queries can find it more easily. If not, please share the details; we're always happy to help.
Thank you.
Hi,
I tried to load the data using a PySpark notebook, but I am getting the error below.
Error details
Hi @VenkateshV , Please let us know if your issue is solved. If it is, consider marking the answer that helped 'Accept as Solution', so others with similar queries can find it easily. If not, please share the details.
Thank you.