
Regular Visitor

Data Pipeline Copy Data SQL to JSON data type error


I'm getting the same error for a numeric(14,0) data type as well, so this appears to be a bug. Anyone else seeing this?


I'm doing some testing on Data pipelines pulling data from on-premises SQL. I'm getting the following error when trying to write data into JSON files. The error appears to be a data type issue. The source table uses a user-defined data type of [decimal](25, 10), and the error below indicates a problem serializing the Decimal data type. Anyone know if this is a bug or a limitation?
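One workaround worth trying (a sketch, assuming the Copy activity source is set to use a custom query rather than the table directly) is to CAST the user-defined decimal column to a type the JSON serializer can handle. The column name `amount` below is hypothetical; `dbo.TAT_TIME` is the table from the error message.

```sql
-- Hypothetical source query for the Copy activity; replace the column name.
SELECT
    CAST(amount AS varchar(40)) AS amount,   -- lossless, lands as a JSON string
    CAST(amount AS float)       AS amount_f  -- stays numeric, but may lose precision
FROM dbo.TAT_TIME;
```

Casting to varchar preserves all 10 decimal places; casting to float keeps the value numeric in the JSON output at the cost of possible precision loss for decimal(25, 10) values.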

{
  "dataRead": 530,
  "dataWritten": 0,
  "filesWritten": 0,
  "sourcePeakConnections": 1,
  "sinkPeakConnections": 1,
  "rowsRead": 1,
  "rowsCopied": 0,
  "copyDuration": 7,
  "throughput": 0.265,
  "errors": [
    {
      "Code": 9011,
      "Message": "ErrorCode=UserErrorFailedFileOperation,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The file operation is failed, upload file failed at path: 'a50b860a-309b-403c-87ff-fa2f674889c3/22515c8d-19f9-4cdb-8b29-bd74f38f46ee/Files/mess up/dbo.TAT_TIME.json'.,Source=Microsoft.DataTransfer.Common,''Type=System.ArgumentException,Message=Could not determine JSON object type for type Microsoft.DataTransfer.DataTypes.SqlBigDecimal.,Source=Newtonsoft.Json,'",
      "EventType": 0,
      "Category": 5,
      "Data": {},
      "MsgId": null,
      "ExceptionType": null,
      "Source": null,
      "StackTrace": null,
      "InnerEventInfos": []
    }
  ],
  "usedParallelCopies": 1,
  "executionDetails": [
    {
      "source": {
        "type": "SqlServer"
      },
      "sink": {
        "type": "Lakehouse"
      },
      "status": "Failed",
      "start": "4/18/2024, 9:25:59 AM",
      "duration": 7,
      "usedParallelCopies": 1,
      "profile": {
        "queue": {
          "status": "Completed",
          "duration": 0
        },
        "transfer": {
          "status": "Completed",
          "duration": 2,
          "details": {
            "readingFromSource": {
              "type": "SqlServer",
              "workingDuration": 0,
              "timeToFirstByte": 0
            },
            "writingToSink": {
              "type": "Lakehouse",
              "workingDuration": 0
            }
          }
        }
      },
      "detailedDurations": {
        "queuingDuration": 0,
        "timeToFirstByte": 0,
        "transferDuration": 2
      }
    }
  ],
  "dataConsistencyVerification": {
    "VerificationResult": "NotVerified"
  }
}
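The failure mode in the log ("Could not determine JSON object type for type ... SqlBigDecimal" from Newtonsoft.Json) can be illustrated by analogy with Python's standard json module, which likewise refuses to serialize a high-precision Decimal it has no JSON mapping for. This is just an illustrative sketch of the general problem, not the Fabric pipeline code.

```python
import json
from decimal import Decimal

# Mirrors a decimal(25, 10) value from the source table.
value = Decimal("1234567890123.4567890123")

# json.dumps raises TypeError for Decimal, much like Newtonsoft.Json
# cannot map SqlBigDecimal to a JSON object type.
try:
    json.dumps({"amount": value})
except TypeError as e:
    print("serialization failed:", e)

# Workaround: convert to str (lossless) before serializing.
print(json.dumps({"amount": str(value)}))
# -> {"amount": "1234567890123.4567890123"}
```

The same idea underlies the SQL-side workaround: hand the serializer a type it already knows how to emit.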


Community Support

Hi @nashworth1234 


Apologies for the issue you are facing.
At this time, we are reaching out to the internal team to get some help on this.
We will update you once we hear back from them.
Appreciate your patience.



Hi @nashworth1234 


Apologies for the delay in response from my end.

Please reach out to our support team to gain deeper insights and explore potential solutions. Their expertise will be invaluable in suggesting the most appropriate approach.

Please go ahead and raise a support ticket to reach our support team:

After creating a Support ticket please provide the ticket number as it would help us to track for more information.


Thank you.


Hi @nashworth1234 


We haven't heard from you since the last response and were just checking back to see if you've had a chance to submit a support ticket. If you have, a reference to the ticket number would be greatly appreciated. This will allow us to track the progress of your request and ensure you receive the most efficient support possible.


