Hello,
I query an API using the Copy Activity in Data Pipelines, and I'm struggling to get the mapping right to import all rows into a table in a Fabric Data Warehouse.
The JSON returned has the following schema:
{
	"datasource": "25b10fba",
	"columns": [
		"ID",
		"Name",
		"WQSD ID",
		"Year",
	],
	"metadata": [
		{
			"type": "LONG",
			"dataSourceId": "25b10fba",
			"maxLength": -1,
			"minLength": -1,
			"periodIndex": 0,
			"aggregated": false
		},
		{
			"type": "STRING",
			"dataSourceId": "25b10fba",
			"maxLength": -1,
			"minLength": -1,
			"periodIndex": 0,
			"aggregated": false
		},
		{
			"type": "LONG",
			"dataSourceId": "25b10fba",
			"maxLength": -1,
			"minLength": -1,
			"periodIndex": 0,
			"aggregated": false
		},
		{
			"type": "STRING",
			"dataSourceId": "25b10fba",
			"maxLength": -1,
			"minLength": -1,
			"periodIndex": 0,
			"aggregated": false
		}
	],
	"rows": [
		[
			17432,
			"XXX",
			123,
			"2023"
		],
		[
			17433,
			"YYY",
			456,
			"2022"
		]
		...
	],
	"numRows": 5538,
	"numColumns": 4,
	"fromcache": false,
	"ADFWebActivityResponseHeaders": {
		"Transfer-Encoding": "chunked",
		"Connection": "keep-alive",
		"Vary": "Origin;Access-Control-Request-Method;Access-Control-Request-Headers;Accept-Encoding;User-Agent",
		"X-Content-Type-Options": "nosniff",
		"X-XSS-Protection": "1; mode=block",
		"Pragma": "no-cache",
		"X-Frame-Options": "DENY",
		"Strict-Transport-Security": "max-age=31536000; includeSubDomains",
		"Cache-Control": "no-store, must-revalidate, no-cache, max-age=0",
		"Date": "Thu, 06 Jul 2023 14:35:02 GMT",
		"Content-Type": "application/json; charset=utf-8",
		"Expires": "0"
	},
	"executionDuration": 3
}
I do not need the COLUMNS or METADATA part - I would just like to import all the columns from the ROWS section into a table.
How do I need to set up the mapping in the Copy Activity to achieve this?
Thanks for any advice!
Hi @chris__1 ,
Did you get this to work? I'm having the same issue.
Br
Marius 🙂
The Advanced Editor and Collection Reference features exist in Azure Data Factory but not in Microsoft Fabric Data Factory. I'm running into the same problem: my JSON data cannot be mapped properly because, unlike in Azure Data Factory, the Copy Activity in Fabric does not recognize the array data type.
Can you check whether explicit mapping like below works? For details, refer to Schema and data type mapping in copy activity - Azure Data Factory & Azure Synapse | Microsoft Learn.
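For the schema in the original post, the translator part of the copy activity could look roughly like the sketch below. This is only an illustration: the sink column names and types are taken from the posted columns/metadata arrays, and the ordinal paths [0] to [3] are my assumption for an array-of-arrays source, so they may need adjusting to whatever the mapping editor generates.
	"translator": {
		"type": "TabularTranslator",
		"collectionReference": "$['rows']",
		"mappings": [
			{ "source": { "path": "[0]" }, "sink": { "name": "ID", "type": "Int64" } },
			{ "source": { "path": "[1]" }, "sink": { "name": "Name", "type": "String" } },
			{ "source": { "path": "[2]" }, "sink": { "name": "WQSD ID", "type": "Int64" } },
			{ "source": { "path": "[3]" }, "sink": { "name": "Year", "type": "String" } }
		]
	}
With collectionReference pointing at the rows array, each inner array should become one row in the destination table, and the columns and metadata sections are simply not mapped.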
Thanks so much and thanks for the link to the documentation!
