Hi, we have a lakehouse and a warehouse. The lakehouse has folders containing lists of parquet files. The idea is to access these files and copy them into a warehouse table (append). I know how to read all the .parquet files in a folder using spark.read.parquet, but how do I write them into a table in a different warehouse? We know how to create a table in the same lakehouse, but we want to create the table and copy the data into a different warehouse. Please help; the folder structure is below as an example.
Lakehouse1 -> Files -> MasterFolder -> ChildFolder -> (list of .parquet files)
Warehouse -> Schema -> dbo -> Tables -> Tablename
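For reference, this is roughly how we read the files today. A minimal sketch, assuming the notebook has Lakehouse1 attached as its default lakehouse (the folder names follow the example above); otherwise the full abfss:// path would be needed:

# Reads every .parquet file under the child folder into one DataFrame.
df = spark.read.parquet("Files/MasterFolder/ChildFolder")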
Thank you.
Once you have the DataFrame from the lakehouse, you can write it to a warehouse table residing in a different workspace using the Fabric Spark connector for Data Warehouse.
If you want to append the data to an existing table (or create the table if it does not exist), you can use the `synapsesql` method with a specified write mode.
df.write.mode("append").synapsesql("<warehouse_name>.dbo.Tablename")
If your warehouse is in a different workspace from the one running your notebook, also pass the target workspace ID with the Constants.WorkspaceId option:
from com.microsoft.spark.fabric.Constants import Constants  # provides Constants.WorkspaceId

df.write.option(Constants.WorkspaceId, "<target_workspace_id>").mode("append").synapsesql("<warehouse_name>.dbo.Tablename")
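Putting it together, here is a minimal end-to-end sketch; the folder, warehouse, and table names are placeholders taken from the example in this thread:

import com.microsoft.spark.fabric  # registers synapsesql on the DataFrame writer
from com.microsoft.spark.fabric.Constants import Constants

# Read all .parquet files from the lakehouse folder (path is relative to the attached lakehouse).
df = spark.read.parquet("Files/MasterFolder/ChildFolder")

# Append into the warehouse table in the target workspace; drop the
# WorkspaceId option if the warehouse is in the current workspace.
df.write \
    .option(Constants.WorkspaceId, "<target_workspace_id>") \
    .mode("append") \
    .synapsesql("<warehouse_name>.dbo.Tablename")

Note that the write runs under your own identity, so you need write permission on the target warehouse.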
It worked well, appreciate the help.