rakeshdasari
New Member

Copy data from a Fabric Lakehouse to Snowflake

Hi Community,
     I need to copy data from a Fabric Lakehouse to a Snowflake database. What are the different ways I can move the data?

Thank You!

1 ACCEPTED SOLUTION
nilendraFabric
Super User

Hello @rakeshdasari,

There are multiple ways to do this, and each has different tradeoffs: direct pipelines are faster but require staging, while external tables avoid copying but require upfront permissions.
OneLake External Tables
Expose Fabric Iceberg tables directly in Snowflake via external volumes, so the data can be queried in place without a physical copy.

https://www.snowflake.com/en/blog/microsoft-partnership-enhancing-interoperability
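A minimal sketch of the Snowflake side, assuming your Lakehouse table is already available in Iceberg format; the external volume, catalog integration, and table names, the OneLake URL, tenant ID, and metadata path below are all placeholders for your environment:

```sql
-- Hypothetical names throughout; substitute your workspace, Lakehouse,
-- tenant ID, and Iceberg metadata path.
CREATE OR REPLACE EXTERNAL VOLUME onelake_vol
  STORAGE_LOCATIONS = (
    (
      NAME = 'onelake-location'
      STORAGE_PROVIDER = 'AZURE'
      STORAGE_BASE_URL = 'azure://onelake.blob.fabric.microsoft.com/<workspace>/<lakehouse>.Lakehouse/Tables/'
      AZURE_TENANT_ID = '<tenant-id>'
    )
  );

-- Object-store catalog integration: Snowflake reads the Iceberg
-- metadata files that Fabric writes, rather than copying the data.
CREATE OR REPLACE CATALOG INTEGRATION onelake_catalog
  CATALOG_SOURCE = OBJECT_STORE
  TABLE_FORMAT = ICEBERG
  ENABLED = TRUE;

-- Point a Snowflake Iceberg table at the Fabric-managed metadata.
CREATE ICEBERG TABLE sales
  EXTERNAL_VOLUME = 'onelake_vol'
  CATALOG = 'onelake_catalog'
  METADATA_FILE_PATH = 'sales/metadata/v1.metadata.json';
```

Because the table is externally managed, Snowflake only sees the snapshot referenced by the metadata file you point it at; run `ALTER ICEBERG TABLE ... REFRESH` to pick up newer snapshots.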

Azure Blob Intermediate
Stage data in Azure Blob Storage first for large/complex datasets, then use Snowflake’s `COPY INTO` to load from Blob (ideal for batch transfers).
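As a rough sketch of the Snowflake half of that flow, assuming the table has already been exported to the container as Parquet (the stage name, container URL, SAS token, and target table below are placeholders):

```sql
-- Hypothetical names; replace the account, container, token, and table.
CREATE OR REPLACE STAGE lakehouse_export_stage
  URL = 'azure://<account>.blob.core.windows.net/<container>/export/'
  CREDENTIALS = (AZURE_SAS_TOKEN = '<sas-token>')
  FILE_FORMAT = (TYPE = PARQUET);

-- Bulk-load the staged Parquet files into the target table,
-- mapping columns by name rather than by position.
COPY INTO my_db.my_schema.sales
  FROM @lakehouse_export_stage
  FILE_FORMAT = (TYPE = PARQUET)
  MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;
```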

Pipeline Copy Activity
Use Fabric’s built-in connector to map Lakehouse data directly to Snowflake via the Copy Data assistant; note that optimal performance requires manually configuring staging (Azure Blob).

If this is helpful, please accept the answer.

Hi @nilendraFabric, while connecting to Snowflake using the Copy Data activity, I am getting an authentication issue (invalid credentials) despite providing valid credentials. Do you know why this is happening?
