Hi
I'm testing the new Copy Job feature to copy data from an on-premises SQL Server database, which I access through a local gateway. The copy destination is a Data Warehouse in Fabric, but I get the following error: "The copy pair doesn't support built-in staging storage. Please use customized azure storage."
How can I avoid this error?
I have a Premium Per User license and my account is enabled for Fabric.
Regards
Orlando
Hi,
I can confirm Yilong's recommendation. A Lakehouse lets you write directly to the destination without staging. Currently, quite a few nice features are not available in the Data Warehouse anyway. I saw that a lot of features are coming to the Data Warehouse in Q1 2025, so I'd suggest working with a Lakehouse for now and also benefiting from Direct Lake, etc.
Regards,
Oktay
Did I answer your question? Then please mark my post as the solution.
If I helped you, click on the Thumbs Up to give Kudos.
Hi @adminbiSkill ,
Based on the error message you reported, you can try the following:
1. Create an Azure Storage Account.
Go to the Azure portal. Navigate to "Storage accounts" and click "Create". Fill in the required details (subscription, resource group, storage account name, region, etc.). Choose the appropriate performance and replication options for your needs. Click "Review + create" and then "Create".
2. Configure the Storage Account.
Once the storage account is created, go to the storage account overview. Set up the necessary containers or file shares within the storage account.
3. Update Your Copy Pair Configuration:
Modify your copy pair configuration to point to the newly created Azure storage account. Ensure that the connection string or access keys are correctly configured in your application or service that is performing the copy operation.
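As a rough sketch of steps 1–3, the staging container can also be created programmatically with the `azure-storage-blob` SDK. All names below (account, container, connection string) are hypothetical placeholders, not values from this thread; this assumes the storage account from step 1 already exists:

```python
# Sketch only: assumes an existing Azure storage account and the
# azure-storage-blob package; every name here is a placeholder.


def staging_container_url(account_name: str, container_name: str) -> str:
    """Build the blob endpoint URL you would point the Copy Job's
    'customized Azure storage' staging setting at."""
    return f"https://{account_name}.blob.core.windows.net/{container_name}"


def create_staging_container(connection_string: str, container_name: str) -> None:
    # Requires: pip install azure-storage-blob
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string(connection_string)
    try:
        # Ignore the error if the container already exists.
        service.create_container(container_name)
    except Exception:
        pass


if __name__ == "__main__":
    print(staging_container_url("mystagingacct", "copyjob-staging"))
```

The access keys or connection string from the storage account's "Access keys" blade would then go into the copy pair configuration described in step 3.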
The Copy Job feature is currently in preview and some of its capabilities are not complete yet, so you can also use the Copy Data activity in a pipeline as an alternative. To work around this issue, configure the data pipeline to use an external staging area instead of internal workspace storage, for example Azure Blob Storage or Azure Data Lake Storage Gen2.
For detailed information you can refer to this official documentation below: How to copy data using copy activity - Microsoft Fabric | Microsoft Learn
Best Regards
Yilong Zhou
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Thanks for the reply.
From what you've explained, if I use the new Copy Job feature to copy data from an on-premises SQL Server (through a gateway) to a warehouse in Fabric as the destination, the feature does not let me copy directly into the warehouse. I understand that the feature is in preview.
Do you know whether copying without temporary storage will be supported at some point?
As an alternative, you suggest using the Copy Data activity in a pipeline. I had tried that method before, but I couldn't get incremental copying to work: it always appended data, which didn't suit my needs.
Thank you
Best regards
Orlando
Hi @adminbiSkill ,
You're correct that the new Copy Job feature in Microsoft Fabric currently requires using temporary storage when copying data from a local SQL Server to a warehouse. This is because the feature relies on staging the data before it can be ingested into the warehouse.
The ability to copy data directly to the warehouse without using temporary storage is on the roadmap. While there's no specific ETA yet, it's something that the internal team is working on.
In the meantime, a common workaround is to first load the data into a cloud Lakehouse and then perform another copy activity to move it to the warehouse. This ensures that the data is staged properly before final ingestion.
You can also look at this topic: Solved: On Prem SQL Server to Fabric Warehouse via Copy Da... - Microsoft Fabric Community
Best Regards
Yilong Zhou
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.