I was trying to back up an entire warehouse and had the following questions:
Is it possible to clone a Fabric warehouse into another workspace on a scheduled, automated basis?
Is there a way to back up a Fabric warehouse?
Hi @PriyaJha, as far as I am aware, there's no official built-in feature to back up an entire warehouse into a different workspace. However, there is a workaround using a Data Pipeline. You can record the metadata (name, schema, delta column, etc.) of every table in the warehouse in a control table and pull that configuration via a Lookup activity. Then use a ForEach activity to loop through all the tables, with a Copy activity inside it that copies the data from warehouse A (workspace A) to warehouse B (workspace B). You would use the dynamic content builder to populate the source and destination configurations of the Copy activity, so the whole process is metadata-driven.
If you would like to proceed with standard restore points (system-created or user-defined), note that they have some limitations.
Data warehouse restoration: https://learn.microsoft.com/en-us/fabric/data-warehouse/restore-in-place
Just an FYI - I've used hard-coded values to test it.
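To make the metadata-driven idea concrete, here is a minimal Python sketch of how a control table can drive per-table copy statements. The control-table rows, column names, and generated SQL are illustrative assumptions, not part of any Fabric API; in the actual pipeline, the Lookup activity returns these rows and the Copy activity consumes them through dynamic content expressions.

```python
# Sketch of the metadata-driven copy loop. Everything here (table names,
# columns, SQL shape) is an illustrative assumption; in Fabric, the Lookup
# activity supplies these rows and the Copy activity does the actual copy.

# Control table: one row per warehouse table to back up.
control_rows = [
    {"schema": "dbo", "name": "FactSales",   "delta_column": "ModifiedAt"},
    {"schema": "dbo", "name": "DimCustomer", "delta_column": "ModifiedAt"},
]

def build_copy_sql(row, source_db="WarehouseA", target_db="WarehouseB"):
    """Build the INSERT ... SELECT a Copy activity would effectively run."""
    table = f"[{row['schema']}].[{row['name']}]"
    return (
        f"INSERT INTO [{target_db}].{table} "
        f"SELECT * FROM [{source_db}].{table};"
    )

# Equivalent of the ForEach activity: one copy statement per control row.
statements = [build_copy_sql(row) for row in control_rows]
for sql in statements:
    print(sql)
```

Adding a new table to the backup then only means adding a row to the control table, with no pipeline changes.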
I use deployment pipelines for copying lakehouses and warehouses from one workspace to another. Although this is not their primary design goal, they can handle some similar backup tasks.
Deployment pipelines support selective deployment, so you can pick a specific warehouse to deploy to another workspace. You can also automate the deployment using PowerShell or the Fabric REST APIs. I haven't automated the process myself, so I can't offer more detailed suggestions, but it's worth researching and giving a try!
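If you do want to script the deployment, a rough Python sketch of assembling the REST call might look like the following. The endpoint path and payload shape are assumptions based on the Fabric deployment pipelines REST API; the pipeline ID, stage IDs, item ID, and access token are placeholders you would supply yourself, so verify the exact contract against the official API reference before relying on it.

```python
import json

# Sketch only: the endpoint path and body shape are assumptions based on the
# Fabric REST API for deployment pipelines; all IDs and the token are
# placeholders. Nothing is sent over the network here.
FABRIC_API = "https://api.fabric.microsoft.com/v1"

def build_deploy_request(pipeline_id, source_stage_id, target_stage_id, token):
    """Assemble (url, headers, body) for a stage-content deployment call."""
    url = f"{FABRIC_API}/deploymentPipelines/{pipeline_id}/deploy"
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    body = {
        "sourceStageId": source_stage_id,
        "targetStageId": target_stage_id,
        # Selective deployment: list only the warehouse you want to copy, e.g.
        # "items": [{"itemId": "<warehouse-item-id>", "itemType": "Warehouse"}],
    }
    return url, headers, json.dumps(body)

url, headers, body = build_deploy_request(
    "my-pipeline-id", "dev-stage-id", "test-stage-id", "<access-token>")
print(url)
# Sending would then be e.g. requests.post(url, headers=headers, data=body),
# scheduled from a notebook or an Azure DevOps/GitHub Actions job.
```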
Based on my understanding, it is unfortunately not possible to clone a Fabric warehouse into another workspace. The restore points that are created can only be restored to (i.e., overwrite) the existing warehouse.