We are trying to load data from Synapse to Fabric but are hitting an issue with a FLOAT data type in the target. During the load, the pipeline stages the data into OneLake and is then not able to insert into the FLOAT column due to conversion issues. Original error below:
ErrorCode=DWCopyCommandOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message='DataWarehouse' Copy Command operation failed with error 'Column '' of type 'FLOAT' is not compatible with external data type 'Parquet physical type: FIXED_LEN_BYTE_ARRAY, logical type: DECIMAL(22, 8)', please try with 'DECIMAL(22, 8)'. Underlying data description:
Has anyone faced issues related to the above? We are using a Fabric pipeline for a one-time load but are not able to get it through.
Yes, try changing the target to DECIMAL(22,8),
or
casting the data type to double when you are writing to the stage:
SELECT CAST(my_float_column AS double) AS my_float_column
You can try either option.
DECIMAL(22,8) does not work for us; in Fabric we have to keep FLOAT only. Are there any alternatives?
Hi @AnmolGan81 ,
Since your target schema requires FLOAT, the error you’re encountering is likely due to Parquet’s handling of decimal encoding during staging in OneLake. FLOAT values are being written as DECIMAL(22,8) with a FIXED_LEN_BYTE_ARRAY physical type in Parquet, leading to a mismatch during loading.
To resolve this, it’s recommended to cast your source column to DOUBLE before staging. This will ensure the Parquet file uses the DOUBLE physical type, which matches FLOAT in Fabric.
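For reference, T-SQL (the dialect Synapse uses) has no DOUBLE keyword; its double-precision type is spelled FLOAT(53). A dialect-correct sketch of that cast, with hypothetical table and column names, would be:

-- Minimal sketch, assuming a hypothetical Synapse source table dbo.Measurements.
-- T-SQL spells double precision as FLOAT(53); there is no DOUBLE type.
SELECT
    measurement_id,
    CAST(reading_value AS FLOAT(53)) AS reading_value
FROM dbo.Measurements;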
Thanks for your quick response, @BalajiL.
Regards,
Yugandhar_CST team.
We are not staging anything here; Fabric automatically decides to use staging when loading data from Synapse to the Warehouse. Where can we specify it to use double? Please explain.
Thanks for your response. Based on Microsoft's documented behavior, this issue stems from how Fabric pipelines automatically stage data in OneLake using Parquet format when loading from Synapse to a Fabric Warehouse. During this process, FLOAT values are internally converted to DECIMAL(22,8) with a FIXED_LEN_BYTE_ARRAY physical type, which causes a mismatch when inserting into a target column defined as FLOAT.
Reference:
Troubleshoot the Parquet format connector - Microsoft Fabric | Microsoft Learn
Could you please try the syntax @AntoineW suggested and let me know how it behaves in your setup?
It won't work, and most of the values are NULL for us. I think we need to load it from the Lakehouse to skip the staging step and directly copy the Parquet file with tabular mapping, keeping the float data types.
Hi @AnmolGan81 ,
Thank you for sharing your findings. Loading data directly from the Lakehouse and bypassing the automatic staging step seems like a good approach, especially if tabular mapping preserves the FLOAT data type. Considering the staging limitations, this could serve as an effective workaround.
I look forward to hearing about your results after you try it.
Thank You.
We have an upcoming migration to another environment; we will let you know how it goes.
Could you please let us know the ETA for your upcoming migration to the other environment? This will help us know when to follow up. If you encounter any challenges while implementing the provided solution, please inform us and we'll be glad to assist.
Well, we have tested this scenario: creating a tabular mapping and directly loading the Parquet file from the data lake into the Lakehouse using ADF, and then moving that data into the Warehouse through a stored procedure, works without any staging. Somehow staging in OneLake does not let us move the data into the Warehouse.
This should be an actual solution for the problematic data types, or for anyone looking to copy data directly to test Fabric Warehouse/Lakehouse performance and a one-time migration.
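For anyone reproducing this, the Warehouse-side step could look roughly like the sketch below. All object names are hypothetical, and it assumes the Lakehouse table is reachable from the Warehouse via three-part naming in the same workspace:

-- Minimal sketch of the lakehouse-to-warehouse step; MyLakehouse,
-- dbo.StagedData, and dbo.TargetData are hypothetical names.
CREATE PROCEDURE dbo.usp_LoadTargetData
AS
BEGIN
    INSERT INTO dbo.TargetData (id, my_float_column)
    SELECT
        id,
        CAST(my_float_column AS FLOAT) AS my_float_column  -- target column stays FLOAT
    FROM MyLakehouse.dbo.StagedData;
END;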
Thanks for your workaround.
Thank you for the update. I hope the migration to the new environment goes smoothly. When you've had a chance to test everything, please let me know how it works out.
In your source query, cast to double and try:
SELECT CAST(my_float_column AS double) AS my_float_column
double does not work for us; we get an incorrect syntax error when passing it.
Hello,
Try this syntax: TRY_CAST(column_name AS DECIMAL(18,2)) AS column_name
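In a full query that would be (the source table name is hypothetical):

-- Minimal sketch, assuming a hypothetical source table dbo.SourceData.
-- TRY_CAST returns NULL instead of failing for values it cannot convert,
-- so check for unexpected NULLs after loading.
SELECT
    TRY_CAST(my_float_column AS DECIMAL(18,2)) AS my_float_column
FROM dbo.SourceData;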
We tried with DECIMAL(22,8) and it's not working; on the target side we will have to check whether we can keep DECIMAL(22,8).
Hi @AnmolGan81,
FLOAT is being mapped as DECIMAL(22,8) in the staging Parquet file. This causes the pipeline failure.
1. Change the target table schema data type to DECIMAL(22,8) or double and try it (see the sketch after this list),
or
2. Cast the data type in your source query when writing to the staging layer:
SELECT CAST(my_float_column AS DECIMAL(22,8)) AS my_float_column
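For option 1, a sketch of the target DDL (the table and column names are hypothetical):

-- Minimal sketch of a target table redefined with DECIMAL(22,8);
-- dbo.TargetData and its columns are hypothetical.
CREATE TABLE dbo.TargetData
(
    id INT NOT NULL,
    my_float_column DECIMAL(22, 8) NULL
);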