Hi Team,
I am getting the below error in my Data pipeline when I try to connect to a Snowflake instance. Could you please help?
Please find the screenshot below. In many videos this works without any issues. Why should I create a stage in Snowflake? Any reference links on this?
Error Details:
ErrorCode=SnowflakeExportCopyCommandOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Snowflake Export Copy Command operation failed,Source=Microsoft.DataTransfer.Connectors.Snowflake,''Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Failed to execute the query command during read operation.,Source=Microsoft.DataTransfer.Connectors.GenericAdoNet,''Type=Apache.Arrow.Adbc.C.CAdbcDriverImporter+ImportedAdbcException,Message=[Snowflake] 001081 (42000): SQL compilation error:
Cannot unload to an inlined external location. Please create a stage first and unload to the stage instead.,Source=Apache.Arrow.Adbc,'
Thanks
Hi @v-sathmakuri @smeetsh, I have found the solution for this issue. In my Snowflake account the flag PREVENT_UNLOAD_TO_INLINE_URL is set to true, which means we cannot copy data directly to an external location, for security reasons. We need to create a storage integration in Snowflake and provide that storage integration name in the copy activity parameters in Fabric; only then will the copy activity succeed. Thank you all for your support.
We need to follow Option 1 mentioned in this article:
https://docs.snowflake.com/en/user-guide/data-load-azure-config
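As a rough sketch of that Option 1 setup (the integration name, tenant ID, storage account, and container below are placeholders, not values from this thread), the flag can be checked and the storage integration created in Snowflake like this:

```sql
-- Check whether the account parameter behind the error is enabled
SHOW PARAMETERS LIKE 'PREVENT_UNLOAD_TO_INLINE_URL' IN ACCOUNT;

-- Option 1: create a storage integration for the Azure storage location
-- (all names and locations below are placeholders)
CREATE STORAGE INTEGRATION azure_fabric_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'AZURE'
  ENABLED = TRUE
  AZURE_TENANT_ID = '<your-tenant-id>'
  STORAGE_ALLOWED_LOCATIONS = ('azure://<storageaccount>.blob.core.windows.net/<container>/');

-- Describe it to obtain the consent URL and app name needed
-- to grant Snowflake access to the storage account in Azure
DESC STORAGE INTEGRATION azure_fabric_int;
```

The integration name (here `azure_fabric_int`) is then what you supply as the storage integration parameter of the copy activity in Fabric.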
Create a separate connection of type Snowflake first. Then use that connection in the Copy data activity. You should be able to read the whole table, but you could also use a simple `select * from tablename` if you want.
The party that supplied you with the Snowflake access may require you to log in to the web portal first to change your password.
Hi @NagaRK ,
Thank you for reaching out to Microsoft Fabric Community.
I have established a connection to Snowflake. Instead of selecting the table to fetch the data, try using a query to retrieve the data from Snowflake. Please find the below screenshot for reference. Let us know if you still encounter any issues.
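For example (the database, schema, and table names here are hypothetical), the source query in the copy activity can be as simple as:

```sql
SELECT *
FROM MY_DATABASE.MY_SCHEMA.MY_TABLE;
```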
If this post helps, please consider accepting it as a solution to help other members find it more quickly, and don't forget to give a "Kudos" – I'd truly appreciate it!
Thank you!!