KBD
Helper II

pipeline Copy Activity JSON file with multiple documents - Mapping

Hello All:

 

I am making no progress on this issue, so I will put it out to you good folks.

We are in this situation because Data Factory has issues with attributes that start with special characters, in this case @ (the at sign).

My data comes in via a REST API and looks like this: {"@odata.context":"https://fifa.com/goalsAssist","value":[ { .......

 

sample file here:

{"@odata.context":"https://fifa.com/goalsAssist","value":[  {    "team": "BLUE",    "player": "ROBIN",    "yyyymm": 202201,    "goals": 1,    "assists": 0,    "minutes played": 20,    "games played": 1  },  {    "team": "BLUE",    "player": "ROBIN",    "yyyymm": 202202,    "goals": 0,    "assists": 0,    "minutes played": 20,    "games played": 1  },  {    "team": "BLUE",    "player": "ROBIN",    "yyyymm": 202203,    "goals": 0,    "assists": 0,    "minutes played": 20,    "games played": 1  },  {    "team": "BLUE",    "player": "ROBIN",    "yyyymm": 202204,    "goals": 2,    "assists": 0,    "minutes played": 45,    "games played": 1  },  {    "team": "BLUE",    "player": "ROBIN",    "yyyymm": 202205,    "goals": 0,    "assists": 0,    "minutes played": 45,    "games played": 2  },  {    "team": "BLUE",    "player": "ROBIN",    "yyyymm": 202206,    "goals": 3,    "assists": 1,    "minutes played": 90,    "games played": 3  },  {    "team": "BLUE",    "player": "ROBIN",    "yyyymm": 202207,    "goals": 1,    "assists": 1,    "minutes played": 30,    "games played": 1  },  {    "team": "BLUE",    "player": "ROBIN",    "yyyymm": 202208,    "goals": 4,    "assists": 2,    "minutes played": 110,    "games played": 3  },  {    "team": "BLUE",    "player": "ROBIN",    "yyyymm": 202209,    "goals": 6,    "assists": 2,    "minutes played": 110,    "games played": 3  } ] }
{"@odata.context":"https://fifa.com/goalsAssist","value":[  {    "team": "GREEN",    "player": "BOBBY",    "yyyymm": 202201,    "goals": 2,    "assists": 1,    "minutes played": 70,    "games played": 2  },  {    "team": "GREEN",    "player": "PATTSY",    "yyyymm": 202201,    "goals": 0,    "assists": 2,    "minutes played": 45,    "games played": 1  },  {    "team": "GREEN",    "player": "BOBBY",    "yyyymm": 202202,    "goals": 1,    "assists": 0,    "minutes played": 60,    "games played": 2  },  {    "team": "GREEN",    "player": "PATTSY",    "yyyymm": 202202,    "goals": 0,    "assists": 3,    "minutes played": 45,    "games played": 1  },  {    "team": "GREEN",    "player": "BOBBY",    "yyyymm": 202203,    "goals": 0,    "assists": 0,    "minutes played": 50,    "games played": 2  },  {    "team": "GREEN",    "player": "PATTSY",    "yyyymm": 202203,    "goals": 0,    "assists": 1,    "minutes played": 90,    "games played": 1  },  {    "team": "GREEN",    "player": "BOBBY",    "yyyymm": 202204,    "goals": 3,    "assists": 2,    "minutes played": 40,    "games played": 3  },  {    "team": "GREEN",    "player": "PATTSY",    "yyyymm": 202204,    "goals": 0,    "assists": 2,    "minutes played": 90,    "games played": 2  },  {    "team": "GREEN",    "player": "BOBBY",    "yyyymm": 202205,    "goals": 2,    "assists": 0,    "minutes played": 40,    "games played": 3  },  {    "team": "GREEN",    "player": "PATTSY",    "yyyymm": 202205,    "goals": 1,    "assists": 3,    "minutes played": 120,    "games played": 3  },  {    "team": "GREEN",    "player": "BOBBY",    "yyyymm": 202206,    "goals": 1,    "assists": 3,    "minutes played": 40,    "games played": 3  },  {    "team": "GREEN",    "player": "PATTSY",    "yyyymm": 202206,    "goals": 0,    "assists": 0,    "minutes played": 45,    "games played": 1  },  {    "team": "GREEN",    "player": "BOBBY",    "yyyymm": 202207,    "goals": 0,    "assists": 1,    "minutes played": 10,    "games played": 1  },  {    "team": "GREEN",    "player": "PATTSY",    "yyyymm": 202207,    "goals": 0,    "assists": 0,    "minutes played": 60,    "games played": 2  },  {    "team": "GREEN",    "player": "BOBBY",    "yyyymm": 202208,    "goals": 4,    "assists": 4,    "minutes played": 90,    "games played": 3  },  {    "team": "GREEN",    "player": "PATTSY",    "yyyymm": 202208,    "goals": 0,    "assists": 1,    "minutes played": 90,    "games played": 2  },  {    "team": "GREEN",    "player": "BOBBY",    "yyyymm": 202209,    "goals": 3,    "assists": 6,    "minutes played": 75,    "games played": 2  },  {    "team": "GREEN",    "player": "PATTSY",    "yyyymm": 202209,    "goals": 1,    "assists": 3,    "minutes played": 120,    "games played": 2  } ]  }
{"@odata.context":"https://fifa.com/goalsAssist","value":[]}

 

The Data Factory activities choke on the @odata.context attribute.

Please note there are multiple JSON docs in the file, and there are multiple JSON objects per line in the file.

I have been able to handle the multiple JSON docs in the file and the multiple JSON objects per line in

another pipeline with the Flatten tool.

My thought is to use Mapping on the Copy activity to ignore the @odata.context attribute.

I have tried many mappings with no luck.

 

KBD_0-1752869484648.png

 

This is one attempt. @odata.context is NOT included; the rest of the data I want to move over as is.

 

KD

 

1 ACCEPTED SOLUTION
justinjbird
Frequent Visitor

It wasn't clear from your description where you intend to send the data or what your specific issue was, but I attempted this using the example file and sent it to a lakehouse table.

 

You have to set up a collection reference, which is the array you want to iterate over. In your structure that would be `$['value']`; from there you map the columns relative to the collection path. Screenshot of my mapping...
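For reference, the same mapping can be expressed as a `TabularTranslator` in the Copy activity's pipeline JSON. This is only a sketch based on the example file: the sink column names are illustrative assumptions, and source paths are written relative to the `collectionReference` array:

```json
"translator": {
  "type": "TabularTranslator",
  "collectionReference": "$['value']",
  "mappings": [
    { "source": { "path": "['team']" },           "sink": { "name": "team" } },
    { "source": { "path": "['player']" },         "sink": { "name": "player" } },
    { "source": { "path": "['yyyymm']" },         "sink": { "name": "yyyymm" } },
    { "source": { "path": "['goals']" },          "sink": { "name": "goals" } },
    { "source": { "path": "['assists']" },        "sink": { "name": "assists" } },
    { "source": { "path": "['minutes played']" }, "sink": { "name": "minutes_played" } },
    { "source": { "path": "['games played']" },   "sink": { "name": "games_played" } }
  ]
}
```

Because `@odata.context` never appears in `mappings`, the Copy activity simply skips it.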

 

justinjbird_0-1754135500893.png

 

27 rows copied from the example file...

justinjbird_1-1754135588202.png

 

If you are trying to send it somewhere else, the same principle should apply. Note that in the example I also added an additional column, `runDTS`, for debugging.

 

 

 


8 REPLIES 8
Thanks for your kind attention to this matter.

That is what I came up with also.

In short: work only with the value array and make no reference to @odata.context.

 

It took a while to figure out the correct configuration.

 

Thanks again for your suggestion.

 

KD

 

v-nmadadi-msft
Community Support

Hi @KBD 

As we haven't heard back from you, we wanted to kindly follow up to check whether the suggestions provided by the community members resolved the issue. Please feel free to contact us if you have any further questions.

 

Thanks and regards

v-nmadadi-msft
Community Support

Hi @KBD 

May I check whether this issue has been resolved? If not, please feel free to contact us with any further questions.


Thank you

v-nmadadi-msft
Community Support

Hi @KBD 

I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions.


Thank you.

BhaveshPatel
Community Champion

Then write some code in Python notebooks and do the rest of the UI/UX in Data Factory. This is how I would do it:

Python --> Spark DataFrame --> convert to a Delta table (this part is programming code); once the Delta table is ready, use a Copy activity in Fabric Data Factory.

Thanks & Regards,
Bhavesh

Love the Self Service BI.
Please use the 'Mark as answer' link to mark a post that answers your question. If you find a reply helpful, please remember to give Kudos.
BhaveshPatel
Community Champion

Rather than Data Factory, I would start with Python notebooks, if you know how to do ETL on REST APIs in Python. It is just a JSON file.

Thanks & Regards,
Bhavesh

Love the Self Service BI.
Please use the 'Mark as answer' link to mark a post that answers your question. If you find a reply helpful, please remember to give Kudos.

I already have this working in Python with no issues.

But Corporate, the powers that be, want a Data Factory solution because it is "no code".
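For comparison, here is a minimal standard-library sketch of how such a Python approach might look (illustrative names only, not the poster's actual script). It splits the file's concatenated JSON documents, flattens each `value` array, and never touches `@odata.context`:

```python
import json

def parse_multidoc(text):
    """Split a stream of concatenated JSON documents into Python objects.

    The source file holds several JSON docs back to back (not one JSON
    array), so json.loads() on the whole file fails; raw_decode walks
    through them one document at a time.
    """
    decoder = json.JSONDecoder()
    idx, docs = 0, []
    while idx < len(text):
        # Skip whitespace/newlines between documents
        while idx < len(text) and text[idx].isspace():
            idx += 1
        if idx >= len(text):
            break
        doc, end = decoder.raw_decode(text, idx)
        docs.append(doc)
        idx = end
    return docs

def flatten(docs):
    """Keep only the rows in each doc's 'value' array; @odata.context is
    simply never referenced, so it causes no trouble here."""
    return [row for doc in docs for row in doc.get("value", [])]

# Two-document sample in the same shape as the file in the question
sample = (
    '{"@odata.context":"https://fifa.com/goalsAssist","value":['
    '{"team":"BLUE","player":"ROBIN","yyyymm":202201,"goals":1}]}\n'
    '{"@odata.context":"https://fifa.com/goalsAssist","value":[]}'
)
rows = flatten(parse_multidoc(sample))
print(rows)  # [{'team': 'BLUE', 'player': 'ROBIN', 'yyyymm': 202201, 'goals': 1}]
```

This is essentially the same "collection reference" idea as the accepted mapping answer, just done in code.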

 

KD
