KylieF
Frequent Visitor

Copy a Table in SQL Server into Parquet Format in Blob Storage

Hi there,

 

I've built a data pipeline that copies a table from SQL Server into a Parquet file in Azure Blob Storage, and I got the error message below:

 

ErrorCode=UserErrorInvalidValueInPayload,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Failed to convert the value in 'compressionCodec' property to 'Microsoft.DataTransfer.Richfile.Core.ParquetCompressionCodec' type. Please make sure the payload structure and value are correct.,Source=Microsoft.DataTransfer.DataContracts,''Type=System.InvalidCastException,Message=Null object cannot be converted to a value type.,Source=mscorlib,'

 

How can I fix it, please?

 

Cheers,

Kylie

1 ACCEPTED SOLUTION
govindarajan_d
Super User

Hi @KylieF,

 

We would need to do some debugging to understand which values are actually causing the problem.

Please enable fault tolerance with 'Skip incompatible rows'. This will copy all valid rows to the sink.

Also enable logging so that a log file is created containing the rows that fail. You can then inspect individual columns to find any special character or value that is causing the problem.

govindarajan_d_0-1740032439609.png
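
The settings described above can be sketched in the raw copy activity JSON. This is a minimal, hedged example; the linked service name (AzureBlobStorageLS) and log path are placeholders, and the exact property names should be verified against your ADF version:

```json
{
  "name": "CopySqlTableToParquet",
  "type": "Copy",
  "typeProperties": {
    "source": { "type": "SqlServerSource" },
    "sink": { "type": "ParquetSink" },
    "enableSkipIncompatibleRow": true,
    "logSettings": {
      "enableCopyActivityLog": true,
      "copyActivityLogSettings": { "logLevel": "Warning" },
      "logLocationSettings": {
        "linkedServiceName": {
          "referenceName": "AzureBlobStorageLS",
          "type": "LinkedServiceReference"
        },
        "path": "copyactivitylogs"
      }
    }
  }
}
```

With "logLevel": "Warning", the skipped incompatible rows are written to the log location, so you can inspect exactly which rows and values failed.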

 

In addition to that, you can try mapping the columns manually, specifying the column names and data types. Note that Parquet has some restrictions on column names.

 

govindarajan_d_1-1740032571294.png
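
A manual mapping like the one above can also be written directly in the copy activity's translator section. This is a sketch with hypothetical column names; it renames a source column containing a space ("Order Date") to a Parquet-safe sink name:

```json
"translator": {
  "type": "TabularTranslator",
  "mappings": [
    {
      "source": { "name": "OrderID", "type": "Int32" },
      "sink": { "name": "order_id", "type": "Int32" }
    },
    {
      "source": { "name": "Order Date", "type": "DateTime" },
      "sink": { "name": "order_date", "type": "DateTime" }
    }
  ]
}
```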

 

 


2 REPLIES
Sureshvarma
Frequent Visitor

Hi @KylieF 

It looks like your data pipeline is trying to write a Parquet file to Azure Blob Storage, but the compressionCodec property is either missing or set to an invalid value. Here’s how you can fix it:

Possible Causes and Fixes:

  1. Set a Valid Compression Codec
    Ensure that the compressionCodec property is set correctly. The supported values for Parquet compression in Azure Data Factory are:

    • None
    • Gzip
    • Snappy
    • LZO (note that the copy activity currently does not support LZO when reading or writing Parquet files)

    Example JSON snippet for the dataset configuration:

    ```json
    "format": { "type": "ParquetFormat", "compressionCodec": "Snappy" }
    ```
  2. Check for Null or Missing Property
    The error suggests that compressionCodec might be missing or set to null. If you haven't explicitly defined it, try setting it to "None" instead of leaving it blank.

  3. Validate JSON Payload in the Dataset Definition
    If you are using Azure Data Factory (ADF) or Synapse Pipelines, go to the Sink dataset (Parquet file) settings and verify that the compression setting is defined.

  4. Check Your Pipeline Parameters
    If compressionCodec is set dynamically using pipeline parameters, ensure the parameter value is correctly assigned before execution.

  5. Restart the Pipeline After Fixing
    Once you've corrected the issue, publish the changes and restart the pipeline.
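
Putting points 2 and 4 together, one way to guarantee compressionCodec is never null is to drive it from a dataset parameter with a default value. This is a hedged sketch of a modern-style Parquet dataset definition; the dataset name, linked service name (AzureBlobStorageLS), container, and file name are all placeholders:

```json
{
  "name": "ParquetSinkDataset",
  "properties": {
    "type": "Parquet",
    "linkedServiceName": {
      "referenceName": "AzureBlobStorageLS",
      "type": "LinkedServiceReference"
    },
    "parameters": {
      "codec": { "type": "string", "defaultValue": "snappy" }
    },
    "typeProperties": {
      "location": {
        "type": "AzureBlobStorageLocation",
        "container": "output",
        "fileName": "mytable.parquet"
      },
      "compressionCodec": {
        "value": "@dataset().codec",
        "type": "Expression"
      }
    }
  }
}
```

Because the parameter has "defaultValue": "snappy", the codec falls back to a valid value even when the pipeline does not pass one in, which avoids the null-to-value-type cast error in the original message.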

 

 
