I want to know how I can transfer a Lakehouse from one workspace to another, including its tables, files, and other contents. Please explain this briefly, and also provide documentation and video links to help me understand it.
Hi @Naman_
Transferring a Lakehouse in Microsoft Fabric from one workspace to another involves specific methods, as direct transfer is not natively supported.
Using OneLake Shortcuts
Instead of duplicating data, you can create a shortcut in the destination workspace that points to the source Lakehouse. This allows users to access data without physically moving it:
• Go to the destination workspace.
• Create a new shortcut pointing to the source Lakehouse’s folder or tables.
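A shortcut can also be created programmatically through the Fabric REST API (POST /v1/workspaces/{workspaceId}/items/{lakehouseId}/shortcuts, called against the destination Lakehouse). Below is a minimal sketch of the request body; the GUIDs and the `Tables/sales` path are placeholders, and the exact field shape should be verified against the current API reference:

```python
def build_onelake_shortcut_payload(name, source_workspace_id, source_item_id, source_path):
    """Request body for the Fabric REST 'Create Shortcut' call, made
    against the destination Lakehouse. All IDs here are placeholders."""
    return {
        "name": name,      # shortcut name shown in the destination Lakehouse
        "path": "Tables",  # where the shortcut appears (Tables or Files)
        "target": {
            "oneLake": {
                "workspaceId": source_workspace_id,  # source workspace GUID
                "itemId": source_item_id,            # source Lakehouse GUID
                "path": source_path,                 # e.g. "Tables/sales"
            }
        },
    }

payload = build_onelake_shortcut_payload(
    "sales", "<source-workspace-guid>", "<source-lakehouse-guid>", "Tables/sales"
)
```

The payload would then be sent with an authenticated POST (for example via `requests`) to the shortcuts endpoint of the destination Lakehouse.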
Using Data Pipelines
Microsoft Fabric’s Data Factory allows you to copy data between Lakehouses in different workspaces using pipelines:
• Create a new pipeline in the destination workspace.
• Use the Copy Activity to specify the source Lakehouse (from the original workspace) and the destination Lakehouse.
• Configure the data mapping (e.g., tables or files) and run the pipeline.
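As an alternative to the Copy Activity, a Fabric notebook in the destination workspace can read the source Lakehouse directly by its OneLake abfss URI, since OneLake exposes every workspace through a single DFS endpoint. The helper below builds that URI; the workspace and Lakehouse names match the walkthrough later in this thread, while the table name `policies` is a hypothetical example:

```python
def onelake_table_path(workspace, lakehouse, table):
    """Build the OneLake abfss URI for a Delta table in another
    workspace (workspace/item names or GUIDs both work)."""
    return (f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
            f"{lakehouse}.Lakehouse/Tables/{table}")

# In a Fabric notebook this path can feed a cross-workspace copy, e.g.:
# df = spark.read.format("delta").load(
#         onelake_table_path("Fabric", "Bronze_Insurance_L1", "policies"))
# df.write.format("delta").mode("overwrite").saveAsTable("policies")
```

The Spark read/write lines are left as comments because they only run inside a Fabric notebook session.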
Deployment Pipelines
For a more automated approach, deployment pipelines can be used to transfer artifacts between workspaces:
• Configure a deployment pipeline that includes your source Lakehouse.
• Deploy it to the target workspace while retaining configurations such as table names.
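Deployment can also be triggered via the REST API (POST /v1/deploymentPipelines/{pipelineId}/deploy). The sketch below shows a plausible request body for deploying just the Lakehouse item; the field names reflect my reading of the deployment pipelines API and should be checked against the current reference before use:

```python
def build_deploy_request(source_stage_id, lakehouse_item_id, note):
    """Assumed body for the deployment pipelines 'deploy' call,
    selectively deploying one Lakehouse item between stages."""
    return {
        "sourceStageId": source_stage_id,  # stage to deploy from
        "items": [
            {
                "sourceItemId": lakehouse_item_id,  # Lakehouse GUID in the source stage
                "itemType": "Lakehouse",
            }
        ],
        "note": note,  # free-text note shown in deployment history
    }
```

Omitting the `items` list would typically deploy all supported items in the stage rather than just the Lakehouse.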
Limitations
• Direct cross-workspace copying of files or tables is not supported natively due to security restrictions.
• Some methods (e.g., shortcuts) may only provide read access rather than full duplication.
• Large datasets may require additional time and resources for transfer.
Hope this helps.
Please accept the solution if you found it useful.
Hi @Naman_
Thank you for reaching out to the Microsoft forum community.
As mentioned by @nilendraFabric and @timahenning2 in their posts, there are multiple approaches to transferring a Lakehouse between workspaces.
To help you understand, I implemented it using a data pipeline, which facilitates an easy transfer of the Lakehouse between different workspaces.
Please review the steps and screenshots below to achieve the target.
I've created two workspaces in my Power BI service.
Created a Lakehouse named Bronze_Insurance_L1 in the workspace Fabric.
Using a data pipeline Copy activity, we can move the Lakehouse data from one workspace to another.
Using the pipeline activity, I moved the Lakehouse data to the workspace named Transfer_1.
If this post helps, please Accept it as a solution and drop a "Kudos" so other members can find it more easily.
Thanks.
Hi @Naman_
May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems find the answer faster.
Thank you.
Hi @Naman_
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.
Thank you.
Hi @Naman_
I hope this information is helpful. Please let me know if you have any further questions or if you'd like to discuss this further. If this answers your question, please Accept it as a solution and give it a 'Kudos' so others can find it easily.
Thank you.
These are really good options. In addition, one more idea is to use Azure Data Studio to compare SQL objects. You can easily determine the differences using the SQL analytics endpoints, create a T-SQL script, and run it to create or update the base schema.
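One hedged sketch of that comparison: query the standard `INFORMATION_SCHEMA.COLUMNS` view on each Lakehouse's SQL analytics endpoint (e.g. from Azure Data Studio) and diff the two result sets. The helper below just builds the T-SQL text; the `dbo` schema is the usual default but is an assumption here:

```python
def schema_compare_query(schema="dbo"):
    """T-SQL listing table/column definitions from a SQL analytics
    endpoint; run it on both endpoints and diff the results."""
    return f"""SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE, IS_NULLABLE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = '{schema}'
ORDER BY TABLE_NAME, ORDINAL_POSITION;"""
```

Running the same query against source and destination endpoints and comparing row sets shows exactly which tables or columns still need to be scripted across.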