Hello,
I query an API using the Copy Activity in Data Pipelines. The JSON returned has the schema below, and I'm struggling to get the mapping right to import all rows into a table in a Fabric Data Warehouse:
{
"datasource": "25b10fba",
"columns": [
"ID",
"Name",
"WQSD ID",
"Year"
],
"metadata": [
{
"type": "LONG",
"dataSourceId": "25b10fba",
"maxLength": -1,
"minLength": -1,
"periodIndex": 0,
"aggregated": false
},
{
"type": "STRING",
"dataSourceId": "25b10fba",
"maxLength": -1,
"minLength": -1,
"periodIndex": 0,
"aggregated": false
},
{
"type": "LONG",
"dataSourceId": "25b10fba",
"maxLength": -1,
"minLength": -1,
"periodIndex": 0,
"aggregated": false
},
{
"type": "STRING",
"dataSourceId": "25b10fba",
"maxLength": -1,
"minLength": -1,
"periodIndex": 0,
"aggregated": false
}
],
"rows": [
[
17432,
"XXX",
123,
"2023"
],
[
17433,
"YYY",
456,
"2022"
]
...
],
"numRows": 5538,
"numColumns": 4,
"fromcache": false,
"ADFWebActivityResponseHeaders": {
"Transfer-Encoding": "chunked",
"Connection": "keep-alive",
"Vary": "Origin;Access-Control-Request-Method;Access-Control-Request-Headers;Accept-Encoding;User-Agent",
"X-Content-Type-Options": "nosniff",
"X-XSS-Protection": "1; mode=block",
"Pragma": "no-cache",
"X-Frame-Options": "DENY",
"Strict-Transport-Security": "max-age=31536000; includeSubDomains",
"Cache-Control": "no-store, must-revalidate, no-cache, max-age=0",
"Date": "Thu, 06 Jul 2023 14:35:02 GMT",
"Content-Type": "application/json; charset=utf-8",
"Expires": "0"
},
"executionDuration": 3
}
I do not need the COLUMNS or METADATA parts - I would just like to import all the columns from the ROWS section into a table.
How do I need to set up the mapping in the Copy Activity to achieve this?
Thanks for any advice!
Can you check whether using explicit mapping works? For details, refer to Schema and data type mapping in copy activity - Azure Data Factory & Azure Synapse | Microsoft Learn.
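As a sketch only, an explicit mapping for this payload might look roughly like the translator JSON below. The column names and types are taken from the sample response; the `TabularTranslator` / `collectionReference` property names are the Azure Data Factory syntax from the linked docs, and whether ordinal paths like `$[0]` resolve against an array-of-arrays may depend on the service version, so treat this as a starting point rather than a confirmed solution:

```json
{
  "type": "TabularTranslator",
  "collectionReference": "$['rows']",
  "mappings": [
    { "source": { "path": "$[0]" }, "sink": { "name": "ID",      "type": "Int64"  } },
    { "source": { "path": "$[1]" }, "sink": { "name": "Name",    "type": "String" } },
    { "source": { "path": "$[2]" }, "sink": { "name": "WQSD ID", "type": "Int64"  } },
    { "source": { "path": "$[3]" }, "sink": { "name": "Year",    "type": "String" } }
  ]
}
```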
Hi @chris__1 ,
Did you get this to work? I'm having the same issue.
Br
Marius 🙂
The Advanced Editor and Collection Reference features do not exist in Microsoft Fabric Data Factory; they do exist in Azure Data Factory. I'm running into the same problem: my JSON data cannot be mapped properly because the activity cannot recognize the array data type in Fabric Data Factory the way it can in Azure Data Factory.
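If the Copy Activity mapping can't be made to recognize the rows array, one workaround is to fetch the response yourself (for example in a Fabric notebook) and flatten it before loading. The sketch below is illustrative and assumes you already have the response text in hand; it simply zips each positional row with the `columns` array:

```python
import json

# Sample response in the shape the API returns (truncated to two rows).
response_text = """
{
  "columns": ["ID", "Name", "WQSD ID", "Year"],
  "rows": [
    [17432, "XXX", 123, "2023"],
    [17433, "YYY", 456, "2022"]
  ]
}
"""

payload = json.loads(response_text)

# Pair every positional row with the column names so each row becomes
# an ordinary dict, ready to load into a warehouse table.
records = [dict(zip(payload["columns"], row)) for row in payload["rows"]]

for record in records:
    print(record)
```

From `records` you can build a DataFrame or issue parameterized INSERTs, sidestepping the Copy Activity mapping entirely.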
Thanks so much and thanks for the link to the documentation!