Hi all,
I am trying to load a Parquet file from the Lakehouse into a Data Warehouse table.
I can't find my way around this error:
ErrorCode=UnsupportedPhysicalTypeOfParquetBigDecimal,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=ParquetBigDecimal cannot be written as Parquet physical type of ByteArray.,Source=Microsoft.DataTransfer.Richfile.ParquetTransferPlugin,'
I've seen some posts about this error, but I can't find a way to solve it.
Can anyone help?
Thanks,
Martin
Hi mrojze,
Thank you for your update and for sharing your insights and approach to resolving the issue.
Please continue to utilize the Fabric Community for any further assistance with your queries.
Thank you.
Hi mrojze,
We are following up to see if what we shared solved your issue. If you need more support, please reach out to the Microsoft Fabric community.
Thank you.
Hi mrojze,
We would like to follow up and see whether the details we shared have resolved your problem.
If you need any more assistance, please feel free to connect with the Microsoft Fabric community.
Thank you.
Unfortunately, I wasn't able to solve the issue of reading this file.
We ended up requesting a new file.
The solution would probably be to read it somewhere outside Fabric, transform it, and then bring it into the Lake.
For now, I am no longer working on this.
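For anyone who wants to try that route later, here is a rough, untested sketch. It assumes the blocker is a DECIMAL(39, x) column (the precision error quoted later in this thread) and a notebook where pyarrow is available; pyarrow reads Parquet decimals with precision above 38 as decimal256, so it can open files that Spark rejects. The file paths are placeholders.

```python
import pyarrow as pa
import pyarrow.parquet as pq

# Read the file with pyarrow, which maps DECIMAL(39, x) to decimal256
# instead of failing the way Spark does.
table = pq.read_table("/lakehouse/default/Files/raw/source.parquet")

# Rebuild the schema, narrowing every decimal256 column to DECIMAL(38, scale).
fields = [
    pa.field(f.name, pa.decimal128(38, f.type.scale))
    if pa.types.is_decimal256(f.type)
    else f
    for f in table.schema
]

# The cast raises an error if any value actually needs more than 38 digits.
table = table.cast(pa.schema(fields))

# Rewrite the file; Spark and Copy Activity should now be able to read it.
pq.write_table(table, "/lakehouse/default/Files/staged/source_fixed.parquet")
```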
Hi mrojze,
Thank you for the update.
From what I understand, the error "[DECIMAL_PRECISION_EXCEEDS_MAX_PRECISION] Decimal precision 39 exceeds max precision 38" means that the source Parquet file has a decimal field with precision 39, which is more than the allowed maximum of 38 in Spark and Microsoft Fabric Notebooks. Because of this, the file cannot be read by the system, and we cannot apply casting unless the file is read successfully first.
If you have control over the Parquet file source, please change the decimal field to precision 38 or less (for example, DECIMAL(38, x)) and then try reading or transforming the file again using the PySpark notebook.
If you have any other questions, please feel free to ask the Microsoft Fabric community.
Thank you.
Thank you, @lbendlin, for your response.
Hi mrojze,
We appreciate your query on the Microsoft Fabric Community Forum.
From what I understand, the error occurs because the Parquet file contains Decimal types stored as BYTE_ARRAY, which is currently not supported by Microsoft Fabric’s Copy Activity. This encoding is typically used for BigDecimal in Parquet, but Fabric expects decimals to be stored in supported formats like INT64 or FIXED_LEN_BYTE_ARRAY.
Please follow the workaround below, which rewrites the Parquet file using PySpark, casting the decimal columns to a supported format before ingestion:
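The snippet below is a sketch of those steps; the target type DECIMAL(38, 18) and the file paths are examples, so adjust them to your data. It assumes a Fabric PySpark notebook, where `spark` is the built-in session.

```python
from pyspark.sql import functions as F
from pyspark.sql.types import DecimalType

# Read the original file from the Lakehouse (path is an example).
df = spark.read.parquet("Files/raw/source.parquet")

# Cast every decimal column to a supported type; Spark writes DECIMAL(38, 18)
# as FIXED_LEN_BYTE_ARRAY, which Copy Activity can ingest.
for field in df.schema.fields:
    if isinstance(field.dataType, DecimalType):
        df = df.withColumn(field.name, F.col(field.name).cast(DecimalType(38, 18)))

# Rewrite the file, then point Copy Activity at the new location.
df.write.mode("overwrite").parquet("Files/staged/source_fixed.parquet")
```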
This ensures that the decimals are stored in a supported physical format and resolves the error.
We hope this information helps you resolve the issue.
If you have any further questions, please feel free to reach out to the Microsoft Fabric community.
Thank you.
Thanks for the help!
I am still getting an error: [DECIMAL_PRECISION_EXCEEDS_MAX_PRECISION] Decimal precision 39 exceeds max precision 38.
So you are telling me that there is no way to read this file?
No workarounds?
You would need to downgrade your byte arrays to Int64 before ingesting.
Makes sense.
How do you do that in a pipeline or any other Fabric tool?
Unfortunately, I can't request a new file.
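In case it helps someone later, one rough illustration of that downgrade in a PySpark notebook, assuming the precision problem has been fixed first so Spark can actually read the file, is to store each decimal as a scaled 64-bit integer:

```python
from pyspark.sql import functions as F
from pyspark.sql.types import DecimalType

# Example input: a file Spark can already read (path is a placeholder).
df = spark.read.parquet("Files/staged/source_fixed.parquet")

# Represent each DECIMAL(p, s) as value * 10^s cast to a 64-bit integer, so
# the Parquet physical type becomes INT64. Safe only while the scaled values
# fit in a signed 64-bit range; divide by 10^s when reading the values back.
for field in df.schema.fields:
    if isinstance(field.dataType, DecimalType):
        s = field.dataType.scale
        df = df.withColumn(field.name, (F.col(field.name) * (10 ** s)).cast("long"))

df.write.mode("overwrite").parquet("Files/staged/source_int64.parquet")
```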
Power BI has no support for Int96 or Int128.
If this is important to you, please consider voting for an existing idea or raising a new one at https://ideas.fabric.microsoft.com