DebbieE
Community Champion

Parameterising a JSON to SQL DW Mapping

I have JSON files in a folder.

I am delta loading them into a SQL Data Warehouse, depending on whether they have been processed or not.

At the start I have a Lookup that triggers a stored procedure, usp_GET_PIPELINE_PARAMETERS, which gets 1 row out of the PIPELINE_PARAMETERS table based on the PipelineName.

So I have sqlTable, sqlSchema, container, sourceFolder, file etc., and it picks the correct parameters to add to my pipeline. E.g.

@activity('LookupGetParameters').output.firstRow.rootFolder goes into my Get Metadata activity.
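For context, here is a minimal sketch of what a parameters table and stored procedure like that could look like. The table, column and parameter names below are illustrative assumptions, not the actual objects:

-- Hypothetical parameters table (names are assumptions for illustration)
CREATE TABLE dbo.PIPELINE_PARAMETERS
(
    PipelineName VARCHAR(100) NOT NULL,
    sqlSchema    VARCHAR(50),
    sqlTable     VARCHAR(100),
    container    VARCHAR(100),
    sourceFolder VARCHAR(200),
    rootFolder   VARCHAR(200),
    mapping      NVARCHAR(4000)  -- TabularTranslator JSON stored as a string; use a wider type if the warehouse allows it
);
GO

-- Returns exactly one row, which the Lookup activity exposes as output.firstRow
CREATE PROCEDURE dbo.usp_GET_PIPELINE_PARAMETERS
    @PipelineName VARCHAR(100)
AS
BEGIN
    SET NOCOUNT ON;
    SELECT TOP (1) *
    FROM dbo.PIPELINE_PARAMETERS
    WHERE PipelineName = @PipelineName;
END;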
 
This all works well until we get to the mapping. I have to have a mapping because of the slight complexity of the schema: it has multiple ids, for example, within different arrays, and fields like in and out also appear multiple times within different arrays. So you have to create a mapping.
 
I added a mapping parameter and copied the working mapping from the original file's JSON, but it's not working. This is what is stored in SQL (example; I have changed the names):
 
'{"type": "TabularTranslator", "mappings":
[{ "source": { "path": "[''id'']" }, "sink": { "name": "coach_id", "type": "String" } },
{ "source": { "path": "[''load'']" }, "sink": { "name": "coach_load", "type": "Int32" } },
{ "source": { "path": "[''staff''][''roletype'']" }, "sink": { "name": "staff_roletype", "type": "Int32" } },
{ "source": { "path": "[''staff''][''departments'']" }, "sink": { "name": "staff_departments", "type": "Int32" } },
{ "source": { "path": "[''staffleaving''][''roletype'']" }, "sink": { "name": "staffleaving_roletype", "type": "Int32" } },
{ "source": { "path": "[''staffleaving''][''departments'']" }, "sink": { "name": "staffleaving_departments", "type": "Int32" } },
{ "source": { "path": "$[''created'']" }, "sink": { "name": "created", "type": "DateTime" } },
{ "source": { "path": "$[''#calibrated'']" }, "sink": { "name": "calibrated", "type": "String" } },
{ "source": { "path": "$[''meta''][''headcode'']" }, "sink": { "name": "headcode", "type": "String" } },
{ "source": { "path": "$[''meta''][''station''][''id'']" }, "sink": { "name": "station_id", "type": "String" } },
{ "source": { "path": "$[''meta''][''station''][''lon'']" }, "sink": { "name": "lon", "type": "Single" } },
{ "source": { "path": "$[''meta''][''station''][''lat'']" }, "sink": { "name": "lat", "type": "Single" } },
{ "source": { "path": "$[''counts''][''load'']" }, "sink": { "name": "count_load", "type": "Int32" } },
{ "source": { "path": "$[''counts''][''staff'']" }, "sink": { "name": "staff", "type": "Int32" } },
{ "source": { "path": "$[''counts''][''staffleaving'']" }, "sink": { "name": "staffleaving", "type": "Int32" } },
{ "source": { "path": "$[''staff_id'']" }, "sink": { "name": "staff_id", "type": "String" } },
{ "source": { "path": "$[''ImportDate'']" }, "sink": { "name": "ImportDate", "type": "DateTime" } },
{ "source": { "path": "$[''FileName'']" }, "sink": { "name": "FileName" } } ],
"collectionReference": "$[''departments'']", "mapComplexValuesToString": true } } }'
 
And in the copy data activity I have mapping: 
@activity('LookupGetParameters').output.firstRow.mapping
 
I have this going into a Set Variable so I can see the output, and it matches the SQL (obviously '' is now ').
 
I run the pipeline and I just get the metadata in my end SQL table. No data. So 2 files are in the load and I get 2 rows with no data.
 
I am really struggling to come up with a way of parameterising the mapping. Can anyone help with this?
 
When I look at the output of the Set Variable it contains lots of \ characters, e.g. [{\"source\":{\"path\":\"['id']\"},\"sink\": so I think that is what is causing the issue. But I have tried doing the following:
@string(json(activity('LookupGetParameters').output.firstRow.mapping))
 
and 
 
@replace(string(json(activity('LookupGetParameters').output.firstRow.mapping)),'/','')
 
But this hasn't seemed to work either.
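One way to rule out a malformed string on the SQL side is to confirm that the stored value actually parses as JSON before it reaches the pipeline. A minimal check, assuming the hypothetical table and column names sketched above and that the warehouse supports ISJSON:

SELECT PipelineName,
       ISJSON(mapping) AS IsValidJson,  -- 1 means the stored string parses as JSON
       mapping
FROM   dbo.PIPELINE_PARAMETERS
WHERE  PipelineName = 'MyJsonPipeline'; -- hypothetical pipeline name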
 
1 ACCEPTED SOLUTION

All sorted quite quickly, thanks. I just needed this in the copy activity's mapping:

@json(activity('LookupGetParameters').output.firstRow.mapping)
 
I had to wrap json() around it so the mapping string returned by the lookup is converted back into a JSON object before the copy activity uses it.


2 REPLIES
Anonymous
Not applicable

Hi @DebbieE 

 

I found a blog about dynamic mapping in Azure Data Factory. You could try adjusting its solution so that it applies to a data pipeline in Fabric. It might be helpful.

Dynamically Map JSON to SQL in Azure Data Factory | Under the kover of business intelligence (sqlkov...

 

Best Regards,
Jing
Community Support Team
