VenkateshV
Regular Visitor

Dataflow Gen2 Refresh Fails – "User not authorized for Datamart" Error when Loading from On-premises

We are currently facing a persistent issue when using Dataflow Gen2 to copy data from an on-premises SQL Server table into a pre-created table within a Microsoft Fabric Data Warehouse (Lakehouse).

 

  • The dataflow is created successfully; all steps, including the source connection and destination mapping, complete without errors.

  • Publishing the dataflow is successful.

  • However, when we trigger the refresh, it fails with the following error:

 

Error details:

Refresh error - PickList:
Error Code: Mashup Exception Data Source Error
Error Details: Couldn't refresh the entity because of an issue with the mashup document. MashupException.Error: Failed to insert a table.
InnerException: Unable to create a table on the Lakehouse SQL catalog due to metadata refresh failure, for Lakehouse Id: cfb7a866-1f0c-4306-8081-4c2caf02d6b0 and Batch Id: cfb7a866-1f0c-4306-8081-4c2caf02d6b0@c89a0539-465c-4d72-8a34-d40846b48769$2025-04-23T15:02:29.0297905Z@69c034cb-c2cf-483c-adda-60685ebdfa5f. Underlying error code: 'DmsPbiServiceUserException', error: [error=[code=DmsPbiServiceUserException,pbi.error=[code=DmsPbiServiceUserException,parameters=[ErrorMessage={"error":{"code":"DatamartsUserNotAuthorized","pbi.error":{"code":"DatamartsUserNotAuthorized","parameters":{"ErrorMessage":"User not authorized for datamart"},"details":[],"exceptionCulprit":1}}},HttpStatusCode=400],details={},exceptionCulprit=1]]]
Underlying error: [same message and DatamartsUserNotAuthorized payload as the InnerException above]
Details: Reason = DataSource.Error; ErrorCode = Lakehouse036; Message = [same message and payload as above]; Detail = [error = [...]]
Message.Format = Unable to create a table on the Lakehouse SQL catalog due to metadata refresh failure, for Lakehouse Id: #{0} and Batch Id: #{1}. underlying error code: '#{2}', error: #{3}
Message.Parameters = {"cfb7a866-1f0c-4306-8081-4c2caf02d6b0", "cfb7a866-1f0c-4306-8081-4c2caf02d6b0@c89a0539-465c-4d72-8a34-d40846b48769$2025-04-23T15:02:29.0297905Z@69c034cb-c2cf-483c-adda-60685ebdfa5f", "DmsPbiServiceUserException", [same payload as above]}
ErrorCode = Lakehouse045
Microsoft.Data.Mashup.Error.Context = System GatewayObjectId: 3edba8b7-2170-4fd2-8d53-0f6856adba8f (Request ID: f63e6ae1-5209-45b5-a793-1cc9a4e84bd3)

Environment Details:

  • Gateway version: April 2025 (Version: 3000.266.4) – freshly updated

  • Source: On-premises SQL Server (connectivity confirmed, SELECT privileges granted)

  • Target: Pre-created Fabric Data Warehouse table

  • User Role: Admin privileges on Fabric workspace

  • Authentication: Correctly configured in both source and destination

  • Data Gateway: Online and mapped correctly to the data source

 

What We Have Tried:

  • Verified and tested on-premises SQL Server connection through the gateway

  • Ensured that all relevant permissions are granted (read access to SQL source, write access to DW destination)

  • Confirmed that the user initiating the refresh has Admin privileges within Fabric

  • Attempted refreshing after revalidating credentials and re-mapping the destination

 

Ask:

Despite fulfilling all the prerequisites and configurations, the refresh consistently fails with a "DatamartsUserNotAuthorized" error. We are not using Datamarts — the target is a Fabric Data Warehouse (Lakehouse).

We would appreciate any suggestions, insights, or known workarounds that could help resolve this issue.

Thank you in advance for your support!

 

 

1 ACCEPTED SOLUTION
v-hashadapu
Community Support

Hi @VenkateshV, thank you for reaching out to the Microsoft Community Forum.

 

This is likely due to either missing permissions on the Lakehouse's SQL Analytics Endpoint or a metadata sync issue. First, check your SQL permissions. Even as a workspace admin, you might not have SQL-level rights. Use SSMS to connect to the Lakehouse's SQL Endpoint and confirm you have CONTROL, ALTER, or INSERT permissions on the target table. If not, have your Fabric admin grant them through SSMS or the portal. Next, force a metadata sync by querying the table in SSMS. This activates the endpoint and resolves many sync-related refresh issues.
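To make those checks concrete, here is a minimal sketch of the same steps scripted from Python, assuming pyodbc with ODBC Driver 18 and Azure AD interactive sign-in; the endpoint, database, and table names are placeholders for your own values:

import pyodbc

# Placeholder connection values: the SQL connection string for the Lakehouse's
# SQL analytics endpoint can be copied from the item's settings in the portal.
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your-endpoint>.datawarehouse.fabric.microsoft.com;"
    "Database=<your-lakehouse>;"
    "Authentication=ActiveDirectoryInteractive;"
    "Encrypt=yes;"
)
cursor = conn.cursor()

# Querying the table forces the endpoint to refresh its metadata,
# the same effect as running the query from SSMS.
cursor.execute("SELECT TOP 1 * FROM dbo.YourTable;")
cursor.fetchall()

# List the effective permissions the signed-in user holds on the table;
# INSERT must appear for the Dataflow load to succeed.
cursor.execute(
    "SELECT permission_name FROM sys.fn_my_permissions('dbo.YourTable', 'OBJECT');"
)
for (permission,) in cursor.fetchall():
    print(permission)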

 

Also, review your Dataflow setup. Make sure the destination is set to “Lakehouse,” not “Datamart,” and that the schema exactly matches the pre-created table. Staging must be enabled under Query Settings -> Staging. If issues persist, try writing to a new table created directly by the Dataflow. If that doesn't help, try deleting and recreating the Lakehouse connection under Manage Connections and Gateways, then reconfigure your Dataflow and republish.

 

As a workaround, use a Fabric Notebook with Spark to load the data directly into the Lakehouse. It’s a reliable fallback if Dataflow continues to fail.
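A minimal sketch of that fallback follows, with one caveat: Fabric Spark does not go through the on-premises data gateway, so the notebook needs direct network reach to the SQL Server (for example via VPN or a managed private endpoint). Server, database, table, and credential values are placeholders:

# 'spark' is the session pre-defined in a Fabric notebook.
jdbc_url = (
    "jdbc:sqlserver://<server>:1433;"
    "databaseName=<source-db>;encrypt=true;trustServerCertificate=true"
)

df = (
    spark.read.format("jdbc")                  # Spark's built-in JDBC source
    .option("url", jdbc_url)
    .option("dbtable", "dbo.SourceTable")      # placeholder source table
    .option("user", "<sql-login>")             # placeholder credentials;
    .option("password", "<password>")          # ideally pulled from Azure Key Vault
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .load()
)

# Land the data in the notebook's default Lakehouse as a managed Delta table.
df.write.mode("overwrite").format("delta").saveAsTable("SourceTable")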

 

If this helped solve the issue, please consider marking it 'Accept as Solution' so others with similar queries may find it more easily. If not, please share the details; we're always happy to help.
Thank you.


5 REPLIES
venke
New Member

Hi,

 

I tried to load the data using a PySpark notebook, but I'm getting the error below.

 

Error details

 
 
File /opt/spark/python/lib/pyspark.zip/pyspark/sql/readwriter.py:307, in DataFrameReader.load(self, path, format, schema, **options)
    305     return self._df(self._jreader.load(self._spark._sc._jvm.PythonUtils.toSeq(path)))
    306 else:
--> 307     return self._df(self._jreader.load())
 
File ~/cluster-env/trident_env/lib/python3.10/site-packages/py4j/java_gateway.py:1322, in JavaMember.__call__(self, *args)
   1316 command = proto.CALL_COMMAND_NAME +\
   1317     self.command_header +\
   1318     args_command +\
   1319     proto.END_COMMAND_PART
   1321 answer = self.gateway_client.send_command(command)
-> 1322 return_value = get_return_value(
   1323     answer, self.gateway_client, self.target_id, self.name)
   1325 for temp_arg in temp_args:
   1326     if hasattr(temp_arg, "_detach"):
 
File /opt/spark/python/lib/pyspark.zip/pyspark/errors/exceptions/captured.py:169, in capture_sql_exception.<locals>.deco(*a, **kw)
    167 def deco(*a: Any, **kw: Any) -> Any:
    168     try:
--> 169         return f(*a, **kw)
    170     except Py4JJavaError as e:
    171         converted = convert_exception(e.java_exception)
 
File ~/cluster-env/trident_env/lib/python3.10/site-packages/py4j/protocol.py:326, in get_return_value(answer, gateway_client, target_id, name)
    324 value = OUTPUT_CONVERTER[type](answer[2:], gateway_client)
    325 if answer[1] == REFERENCE_TYPE:
--> 326     raise Py4JJavaError(
    327         "An error occurred while calling {0}{1}{2}.\n".
    328         format(target_id, ".", name), value)
    329 else:
    330     raise Py4JError(
    331         "An error occurred while calling {0}{1}{2}. Trace:\n{3}\n".
    332         format(target_id, ".", name, value))
 
Py4JJavaError: An error occurred while calling o4638.load.
: org.apache.spark.SparkClassNotFoundException: [DATA_SOURCE_NOT_FOUND] Failed to find the data source: <SQL SERVER.....>  Please find packages at `https://spark.apache.org/third-party-projects.html`.
at org.apache.spark.sql.errors.QueryExecutionErrors$.dataSourceNotFoundError(QueryExecutionErrors.scala:738)
at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:648)
at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSourceV2(DataSource.scala:698)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:216)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:174)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:374)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.lang.ClassNotFoundException: UAT-SQL-01.DefaultSource
at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:476)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:594)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:527)
at org.apache.spark.sql.execution.datasources.DataSource$.$anonfun$lookupDataSource$5(DataSource.scala:634)
at scala.util.Try$.apply(Try.scala:213)
at org.apache.spark.sql.execution.datasources.DataSource$.$anonfun$lookupDataSource$4(DataSource.scala:634)
at scala.util.Failure.orElse(Try.scala:224)
at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:634)
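
The [DATA_SOURCE_NOT_FOUND] message, together with "Caused by: java.lang.ClassNotFoundException: UAT-SQL-01.DefaultSource", suggests the server name was passed to .format(), so Spark went looking for a connector class named after the server. The format should be "jdbc", with the server in the connection URL instead. A minimal corrected read, with placeholder database and credentials:

df = (
    spark.read.format("jdbc")  # "jdbc", not the server name
    .option("url", "jdbc:sqlserver://UAT-SQL-01:1433;"
                   "databaseName=<source-db>;encrypt=true;trustServerCertificate=true")
    .option("dbtable", "dbo.SourceTable")   # placeholder source table
    .option("user", "<sql-login>")          # placeholder credentials
    .option("password", "<password>")
    .load()
)

If the format is fixed but the connection then times out, remember that Fabric Spark cannot reach on-premises servers through the data gateway; the cluster needs direct network connectivity to UAT-SQL-01, or the copy has to stay in a gateway-aware path such as Dataflow Gen2 or a pipeline copy activity.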
v-hashadapu
Community Support

Hi @VenkateshV, please let us know if your issue is solved. If it is, consider marking the answer that helped 'Accept as Solution' so others with similar queries can find it easily. If not, please share the details.
Thank you.

