NagaRK
Advocate I

Fabric Snowflake Connector issue: SQL compilation error: Cannot unload to an inlined external location

Hi Team,

 

I am getting the below error in the Data pipeline when I try to connect to a Snowflake instance. Could you please help?

Please find the screenshot below. In many videos I have seen, this works without any issues. Why do I need to create a stage in Snowflake? Any reference links on this?

 

Error Details:

 

ErrorCode=SnowflakeExportCopyCommandOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Snowflake Export Copy Command operation failed,Source=Microsoft.DataTransfer.Connectors.Snowflake,''Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Failed to execute the query command during read operation.,Source=Microsoft.DataTransfer.Connectors.GenericAdoNet,''Type=Apache.Arrow.Adbc.C.CAdbcDriverImporter+ImportedAdbcException,Message=[Snowflake] 001081 (42000): SQL compilation error:

Cannot unload to an inlined external location. Please create a stage first and unload to the stage instead.,Source=Apache.Arrow.Adbc,'

 

 

NagaRK_0-1747668831860.jpeg

 

 

 

Thanks

 

1 ACCEPTED SOLUTION
NagaRK
Advocate I

Hi @v-sathmakuri @smeetsh. I have found the solution for this issue. In my Snowflake account the flag PREVENT_UNLOAD_TO_INLINE_URL was set to TRUE, which means data cannot be copied to an external location directly, for security reasons. So we need to create a storage integration in Snowflake and provide that storage integration name in the Copy activity parameter in Fabric; only then will the copy activity succeed. Thank you all for your support.

 

We need to follow Option 1 mentioned in this article:

https://docs.snowflake.com/en/user-guide/data-load-azure-config
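For reference, Option 1 in that article boils down to creating a storage integration and granting the Fabric connection's role access to it. A minimal sketch, where the integration name `azure_int`, the role `my_role`, the tenant ID, and the storage URL are placeholders for your own values:

```sql
-- Create a storage integration for Azure (replace placeholders with your values)
CREATE STORAGE INTEGRATION azure_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'AZURE'
  ENABLED = TRUE
  AZURE_TENANT_ID = '<your-azure-tenant-id>'
  STORAGE_ALLOWED_LOCATIONS = ('azure://<account>.blob.core.windows.net/<container>/');

-- Shows AZURE_CONSENT_URL and AZURE_MULTI_TENANT_APP_NAME, which you need
-- in order to grant Snowflake access to the storage account
DESC STORAGE INTEGRATION azure_int;

-- The role used by the Fabric connection needs usage on the integration
GRANT USAGE ON INTEGRATION azure_int TO ROLE my_role;
```

The integration name (`azure_int` here) is what goes into the storage integration parameter of the Fabric Copy activity.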

 

 

View solution in original post

6 REPLIES
ares2
New Member

Hi @NagaRK, thanks again for your response.

Yes, we had already done that, but it was still failing. The only way to solve it was to remove the flag PREVENT_UNLOAD_TO_INLINE_URL. After removing this flag, everything started working correctly. It seems the issue was that Fabric uses a URL in the SQL query that points to Azure Storage, and this flag was preventing that behavior.

Hopefully this helps others facing a similar issue. Thanks again for your support!
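For anyone who ends up on this path: the parameter can be inspected and changed at the account level, which requires a role with account-alter privileges (e.g. ACCOUNTADMIN). Keep in mind that disabling it relaxes a control that exists to block ad-hoc unloads to arbitrary URLs:

```sql
-- Check the current value of the account-level parameter
SHOW PARAMETERS LIKE 'PREVENT_UNLOAD_TO_INLINE_URL' IN ACCOUNT;

-- Allow unloading to inline external locations again
-- (security trade-off: permits COPY INTO URLs not defined as stages)
ALTER ACCOUNT SET PREVENT_UNLOAD_TO_INLINE_URL = FALSE;
```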

ares2
New Member

Hi everyone, I'm working on the same task, and despite having configured the storage integration, tested it on the Snowflake side, and tested all connections on the Fabric side, I keep seeing the same error. Below I'm attaching the output of the "copy data" task, which keeps failing. Can anyone please help me with this issue? Thanks in advance.

 

{
  "copyDuration": 9,
  "errors": [
    {
      "Code": 25554,
      "Message": "ErrorCode=SnowflakeExportCopyCommandOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Snowflake Export Copy Command operation failed,Source=Microsoft.DataTransfer.Connectors.Snowflake,''Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Failed to execute the query command during read operation.,Source=Microsoft.DataTransfer.Connectors.GenericAdoNet,''Type=Apache.Arrow.Adbc.C.CAdbcDriverImporter+ImportedAdbcException,Message=[Snowflake] 001081 (42000): SQL compilation error:\nCannot unload to an inlined external location. Please create a stage first and unload to the stage instead.,Source=Apache.Arrow.Adbc,'",
      "EventType": 0,
      "Category": 5,
      "Data": {},
      "MsgId": null,
      "ExceptionType": null,
      "Source": null,
      "StackTrace": null,
      "InnerEventInfos": []
    }
  ],
  "usedDataIntegrationUnits": 4,
  "usedParallelCopies": 1,
  "executionDetails": [
    {
      "source": {
        "type": "SnowflakeV2"
      },
      "sink": {
        "type": "AzureBlobStorage",
        "region": "North Central US"
      },
      "status": "Failed",
      "start": "10/29/2025, 11:53:55 AM",
      "duration": 9,
      "usedDataIntegrationUnits": 4,
      "usedParallelCopies": 1,
      "profile": {
        "queue": {
          "status": "Completed",
          "duration": 5
        },
        "transfer": {
          "status": "Completed",
          "duration": 3
        }
      },
      "detailedDurations": {
        "queuingDuration": 5,
        "transferDuration": 3
      }
    },
    {
      "source": {
        "type": "AzureBlobStorage",
        "region": "North Central US"
      },
      "sink": {
        "type": "DataWarehouse"
      },
      "status": "Canceled",
      "usedDataIntegrationUnits": 4,
      "usedParallelCopies": 1,
      "profile": {},
      "detailedDurations": {}
    }
  ],
  "dataConsistencyVerification": {
    "VerificationResult": "Unsupported"
  }
}

 

Hi @ares2 

We need to follow Option 1 mentioned in this article. Are you able to select and set the storage integration that was created in Snowflake in the Copy activity options?

https://docs.snowflake.com/en/user-guide/data-load-azure-config

 

You should set the storage integration on the Advanced tab, as shown in the screenshot below. You also need to enable staging with Blob storage on the Settings tab.

NagaRK_0-1761750033090.png

 

Thanks,

Rajesh.


smeetsh
Responsive Resident

Create a separate connector of type Snowflake first, then use that connector in the Copy data activity. You should be able to read the whole table, but you could also use a simple select * from tablename if you want.

 

The party that supplied you with the Snowflake access may require you to log in to the web portal first to change your password.

v-sathmakuri
Community Support

Hi  @NagaRK , 

 

Thank you for reaching out to Microsoft Fabric Community.

 

I have established a connection to Snowflake. Instead of selecting the table to fetch the data, try using a query to retrieve the data from Snowflake. Please find the below screenshots for reference. Let us know if you still encounter any issues.
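A query source can be as simple as a fully qualified select; the database, schema, and table names below are placeholders for your own objects:

```sql
-- Hypothetical query for the Copy activity source, instead of picking a table
SELECT * FROM MY_DATABASE.MY_SCHEMA.MY_TABLE;
```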

 

 vsathmakuri_1-1747837448118.png

 

vsathmakuri_0-1747837439872.png

 

If this post helps, please consider accepting it as a solution to help other members find it more quickly, and don't forget to give a "Kudos" – I'd truly appreciate it!

 

Thank you!!

 

 
