Hi Community,
I need to copy data from a Fabric Lakehouse to a Snowflake database. What are the different ways I can move the data?
Thank you!
Hello @rakeshdasari,
There are multiple ways to do this, each with a different tradeoff: direct pipelines are faster but require staging, while external tables avoid copying the data but require upfront permissions.
OneLake External Tables
Expose Fabric Iceberg tables directly in Snowflake via external volumes for real-time querying without physical data copying.
https://www.snowflake.com/en/blog/microsoft-partnership-enhancing-interoperability
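As a rough sketch of the external-volume route (all names, the OneLake path, and the tenant ID below are placeholders; the exact path and catalog integration depend on your workspace setup):

```sql
-- Hypothetical sketch: register OneLake as an external volume in Snowflake.
CREATE EXTERNAL VOLUME onelake_vol
  STORAGE_LOCATIONS = (
    (
      NAME = 'onelake_location'
      STORAGE_PROVIDER = 'AZURE'
      STORAGE_BASE_URL = 'azure://onelake.dfs.fabric.microsoft.com/<workspace>/<lakehouse>.Lakehouse/Tables/'
      AZURE_TENANT_ID = '<your-tenant-id>'
    )
  );

-- Then surface a Fabric Iceberg table through a catalog integration
-- (integration details depend on your environment).
CREATE ICEBERG TABLE my_schema.sales
  EXTERNAL_VOLUME = 'onelake_vol'
  CATALOG = 'my_catalog_integration'
  CATALOG_TABLE_NAME = 'sales';
```

You will also need to grant Snowflake's service principal access to the OneLake path before the volume is usable.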
Azure Blob Intermediate
Stage data in Azure Blob Storage first for large/complex datasets, then use Snowflake’s `COPY INTO` to load from Blob (ideal for batch transfers).
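A minimal sketch of the Blob staging route, assuming the Lakehouse data has been exported as Parquet and using placeholder values for the storage account, container, and SAS token:

```sql
-- Hypothetical sketch: external stage over the Blob container holding the export.
CREATE OR REPLACE STAGE lakehouse_stage
  URL = 'azure://<account>.blob.core.windows.net/<container>/export/'
  CREDENTIALS = (AZURE_SAS_TOKEN = '<sas-token>');

-- Bulk-load the staged Parquet files into the target table.
COPY INTO my_schema.sales
  FROM @lakehouse_stage
  FILE_FORMAT = (TYPE = PARQUET)
  MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;
```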
Pipeline Copy Activity
Use Fabric's built-in Snowflake connector in a pipeline Copy activity (via the Copy Data Assistant) to map Lakehouse tables to Snowflake; for optimal performance on large loads, configure interim staging through Azure Blob Storage manually.
If this is helpful, please accept the answer.
Hi @nilendraFabric, while connecting to Snowflake using the Copy data activity, I am getting an authentication error (invalid credentials) despite providing valid credentials. Do you know why this is happening?