
bechirbenltaief
Regular Visitor

Error Writing DataFrame to Warehouse via synapsesql – Worked Previously

For the past week, I've been encountering the following error when trying to read from the Lakehouse and write the DataFrame to the Warehouse using synapsesql.
Prior to that, everything was working perfectly.
df.write \
    .mode("overwrite") \
    .synapsesql(f"{silverWarehouse}.{tableSchema}.{silver_table}")
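The write call targets a three-part name built from an f-string. As a quick sanity check on how that string resolves, here is a small hypothetical helper (`warehouse_target` is my name for it, not a Fabric API) that builds and validates the `Warehouse.Schema.Table` string before it is handed to `synapsesql`:

```python
def warehouse_target(warehouse: str, schema: str, table: str) -> str:
    """Build the three-part name synapsesql expects: Warehouse.Schema.Table.

    Raises if any part is empty or contains a dot, which would silently
    shift the meaning of the dotted name.
    """
    parts = (warehouse, schema, table)
    for part in parts:
        if not part or "." in part:
            raise ValueError(f"invalid identifier: {part!r}")
    return ".".join(parts)

print(warehouse_target("SilverWarehouse", "dbo", "customers"))
```

In the notebook this would read `df.write.mode("overwrite").synapsesql(warehouse_target(silverWarehouse, tableSchema, silver_table))`, assuming those three variables hold plain identifiers.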
 
##Error
Caused by: com.microsoft.spark.fabric.tds.error.FabricSparkTDSSQLExecQryError: Error executing query-COPY INTO [{Warehouse_name}].[{schema_name}].[{table_name}]
FROM 'abfss://{workspace_id_hyphen}@{workspace_id}.zd9.dfs.fabric.microsoft.com/_system/artifacts/{notbook_id}/5f15f8eb-09f7-4148-aea3-e293ec52726c/user/trusted-service-user/{table_name}2654d7047d3b407281f0513a1950d92e/*.parquet'
WITH(
   FILE_TYPE = 'PARQUET',
   CREDENTIAL = (
       IDENTITY='Shared Access Signature',
       SECRET='?[REDACTED]')). A retry attempt for error code - 13807 isn't expected to change outcome.
at com.microsoft.spark.fabric.tds.utility.FabricTDSSQLUtility$.executeSQLQuery(FabricTDSSQLUtility.scala:397)
at com.microsoft.spark.fabric.tds.utility.FabricTDSSQLUtility$.execQuery(FabricTDSSQLUtility.scala:249)
at com.microsoft.spark.fabric.tds.write.processor.FabricSparkTDSSQLCommitHandler$.commitDataToWarehouse(FabricSparkTDSSQLCommitHandler.scala:116)
... 25 more
Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Content of directory on path 
'https://{workspace_id}.zd9.dfs.fabric.microsoft.com/{workspace_id_hyphen}/_system/artifacts/{notbook_id}/5f15f8eb-09f7-4148-aea3-e293ec52726c/user/trusted-service-user/{table_name}2654d7047d3b407281f0513a1950d92e/*.parquet' cannot be listed.
justinjbird
Resolver I

You have shared the write command in your message, but looking at your error message...
 
##Error
Caused by: com.microsoft.spark.fabric.tds.error.FabricSparkTDSSQLExecQryError: Error executing query-COPY INTO [{Warehouse_name}].[{schema_name}].[{table_name}]
FROM 'abfss://{workspace_id_hyphen}@{workspace_id}.zd9.dfs.fabric.microsoft.com/_system/artifacts/{notbook_id}/5f15f8eb-09f7-4148-aea3-e293ec52726c/user/trusted-service-user/{table_name}2654d7047d3b407281f0513a1950d92e/*.parquet'
WITH(
   FILE_TYPE = 'PARQUET',
   CREDENTIAL = (
       IDENTITY='Shared Access Signature',
       SECRET='?[REDACTED]')). A retry attempt for error code - 13807 isn't expected to change outcome.
at com.microsoft.spark.fabric.tds.utility.FabricTDSSQLUtility$.executeSQLQuery(FabricTDSSQLUtility.scala:397)
at com.microsoft.spark.fabric.tds.utility.FabricTDSSQLUtility$.execQuery(FabricTDSSQLUtility.scala:249)
at com.microsoft.spark.fabric.tds.write.processor.FabricSparkTDSSQLCommitHandler$.commitDataToWarehouse(FabricSparkTDSSQLCommitHandler.scala:116)
... 25 more
Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Content of directory on path 
'https://{workspace_id}.zd9.dfs.fabric.microsoft.com/{workspace_id_hyphen}/_system/artifacts/{notbook_id}/5f15f8eb-09f7-4148-aea3-e293ec52726c/user/trusted-service-user/{table_name}2654d7047d3b407281f0513a1950d92e/*.parquet' cannot be listed.

 

It's the read that is failing, not the write. Would you agree?

 

Have you redacted these paths? The error shows '{workspace_id}' literally. Is that you redacting the output, or has something gone wrong when compiling your path?

 

It seems the issue lies elsewhere, not in the write snippet you have supplied.
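One way to check whether the staging path in the error was compiled correctly (rather than just redacted) is to parse the abfss URI and look at its components. A minimal sketch using only the standard library; `parse_abfss` is a hypothetical diagnostic helper, not part of the Fabric API:

```python
from urllib.parse import urlparse

def parse_abfss(uri: str) -> dict:
    """Split an abfss:// URI of the form abfss://<container>@<account-host>/<path>."""
    parsed = urlparse(uri)
    container, _, host = parsed.netloc.partition("@")
    return {
        "container": container,         # for the Fabric staging path: the workspace id
        "account": host.split(".")[0],  # first label of the storage host
        "path": parsed.path,
        # an unresolved f-string placeholder would show up as literal braces
        "unresolved_placeholder": "{" in uri,
    }

info = parse_abfss(
    "abfss://my-workspace@myworkspace.zd9.dfs.fabric.microsoft.com/_system/artifacts/part-0.parquet"
)
print(info)
```

If `unresolved_placeholder` came back True on the real path, the problem would be path construction rather than permissions.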

bechirbenltaief
Regular Visitor

Thanks justinjbird, but where can we change the SAS key for OneLake?
The thing is, the code mentioned above worked fine for a long time without any issues. The goal is to save a DataFrame, transformed from the Lakehouse, to the Warehouse.
MS Fabric offers the synapsesql method for exactly this, and it worked fine until five days ago.

BhaveshPatel
Community Champion

Hi @bechirbenltaief 

 

This doesn't give us much to go on. What do you mean by "silver warehouse"? Was there a bronze table working before? Can you share some screenshots? 

Thanks & Regards,
Bhavesh

Love the Self Service BI.
Please use the 'Mark as answer' link to mark a post that answers your question. If you find a reply helpful, please remember to give Kudos.

You can easily reproduce the error: use a notebook to read data from a MS Fabric Lakehouse and save it to a MS Fabric Warehouse.

Thanks BhaveshPatel for your response. Sorry, some of the data and screenshots are sensitive, so I can't share them.

But I can explain the issue again:

We work with a medallion architecture. We use a Spark notebook to read data from the bronze Lakehouse, transform it, and finally write the transformed DataFrame to a Fabric Warehouse, which we call the "silver warehouse".

In the notebook, we use the synapsesql method to write the DataFrame to the Warehouse (the source code shared above). This method ran without any issue until 5 days ago. The error indicates that the parquet files cannot be listed and seems to point to the SAS key, yet nothing has changed in our Fabric environment.


Is that clear enough? Does that help you figure out the issue?

 

Béchir

Hi @bechirbenltaief 

 

Due to a change to synapsesql (meaning synapsesql should not be used going forward, i.e. it is being deprecated), we need to use Databricks Delta Lake open-source tables in the bronze and silver warehouses. synapsesql uses the data lake, whereas Databricks Delta Lake uses a _delta_log (transaction log) with parquet files. You should not use a Spark notebook for everything.

 

Python --> Apache Spark ( Data Lake ) --> Databricks Delta Lake ( Pyspark initially and subsequent use Spark SQL ) 

 

Also, Lakehouse = Fabric Warehouse = Power BI / SSAS Tabular Semantic Model 

Thanks & Regards,
Bhavesh


Thank you, Bhavesh Patel, for your response.

However, your answer seems to be outside the scope of our current setup. We are working entirely within a Microsoft Fabric environment.

The objective is to write data from a Microsoft Fabric Lakehouse to a Microsoft Fabric Warehouse using a Fabric Notebook. Databricks is not our scope.

The synapsesql function was working correctly, and according to Microsoft, it has not been deprecated.

spaceman127
Helper III

Hi @bechirbenltaief,

If you have not changed anything so far, then I would also guess that the SAS key needs to be renewed.

Best regards 

Thanks spaceman127 for your hint, but we are on OneLake in a Fabric environment, where the SAS key is a transparent layer managed by the service. 

All right,
that was not clear from the original question.
It seemed as if you were asking about a storage account.
That was a misunderstanding.

 

Do you have any more information for us?

 

Best regards 

 

justinjbird
Resolver I

Just ruling out the simple things... has your SAS key expired by any chance?
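For a classic Azure SAS token the expiry is carried in the token itself as the `se` (signed expiry) field, so checking it needs no service call. A sketch of that check; note that in Fabric the staging SAS is issued by the service and isn't user-visible, so this only applies where you manage a token yourself:

```python
from datetime import datetime, timezone
from typing import Optional
from urllib.parse import parse_qs

def sas_expired(sas_token: str, now: Optional[datetime] = None) -> bool:
    """Return True if the SAS token's 'se' (signed expiry) field is in the past."""
    params = parse_qs(sas_token.lstrip("?"))
    expiry = datetime.fromisoformat(params["se"][0].replace("Z", "+00:00"))
    return expiry < (now or datetime.now(timezone.utc))

print(sas_expired("?sv=2022-11-02&se=2020-06-30T02:23:26Z&sr=c&sig=REDACTED"))
```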
