Hi, we have a lakehouse and a warehouse. The lakehouse has folders containing lists of parquet files. The idea is to access these files and copy them into a warehouse table (append). I know how to read all the .parquet files in a folder using spark.read.parquet, but how do I write these files into a table in a different warehouse? We know how to create a table in the same lakehouse, but we want to create a table and copy the data into a different warehouse. Please help; the folder structure is shown below as an example.
Lakehouse1 -> Files -> MasterFolder -> ChildFolder -> (list of .parquet files)
Warehouse -> Schema -> dbo -> Tables -> Tablename
Thank you.
Once you have the DataFrame from the lakehouse, you can write it to a warehouse table, including one residing in a different workspace, using the Fabric Spark connector for Data Warehouse.
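For reference, the read step you described could look like the following; the path is an assumption based on the folder structure in your question, so adjust it to your actual Files path:

# Read every .parquet file under the child folder into a single DataFrame.
# Assumes the notebook's default lakehouse is Lakehouse1.
df = spark.read.parquet("Files/MasterFolder/ChildFolder")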
If you want to append the data to an existing table (or create the table if it does not exist), you can use the `synapsesql` method with the write mode set to append:
import com.microsoft.spark.fabric  # enables synapsesql on the DataFrame writer
df.write.mode("append").synapsesql("<warehouse_name>.dbo.Tablename")
If your warehouse is in a different workspace from the one running your notebook, specify the target workspace ID with the `Constants.WorkspaceId` option. For example:
from com.microsoft.spark.fabric.Constants import Constants  # needed for Constants.WorkspaceId
df.write.option(Constants.WorkspaceId, "<target_workspace_id>").mode("append").synapsesql("<warehouse_name>.dbo.Tablename")
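Putting it all together, here is a minimal end-to-end sketch; the path, workspace ID, warehouse name, and table name are placeholders to replace with your own values:

import com.microsoft.spark.fabric
from com.microsoft.spark.fabric.Constants import Constants

# Read all parquet files in the source folder (assumes Lakehouse1 is the
# notebook's default lakehouse; use a full abfss:// path otherwise).
df = spark.read.parquet("Files/MasterFolder/ChildFolder")

# Append into the warehouse table in the target workspace; the table is
# created on the first write if it does not already exist.
df.write.option(Constants.WorkspaceId, "<target_workspace_id>") \
    .mode("append") \
    .synapsesql("<warehouse_name>.dbo.Tablename")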
It worked well, thank you for the help.