shivangjha
Regular Visitor

Using HierarchicalTranslator in Fabric copy activity

Hi,
I am using a Fabric copy activity to write some data to a MongoDB Atlas collection in a Fabric Data pipeline.

I want to specify HierarchicalTranslator because I want to create hierarchical data in MongoDB from a tabular source.
The source of the data is a Fabric lakehouse table.

So, if the lakehouse table table1 has columns a, b, c, d,
I want to store these in MongoDB as:

{
  "a": "value of a",
  "parameters": {
    "b": "val_b",
    "c": "val_c",
    "d": "val_d"
  }
}


What I have tried so far:

1. Mapping - the mapping type was Tabular, but the sink setting for the mapping was like below:

b -> 'parameters.b'

...and so on.

But it resulted in the persistence of

{
  "parameters.b": "val_b"
}

in the MongoDB collection.

 

2. Tried editing the pipeline JSON and setting the mapping to HierarchicalTranslator - but it gets silently reverted to Tabular and the result is the same.

Is there any other way?

Please let me know in case I need to give more info.

Thanks in advance!

1 ACCEPTED SOLUTION
shivangjha
Regular Visitor

Fixed the issue partially by writing the documents as JSON files in the lakehouse and then using the copy activity.
The hierarchy of the documents is maintained in MongoDB.
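Roughly, the notebook step that produces the JSON files could look like this (a minimal sketch; it assumes the default Spark session in a Fabric notebook, and the table and Files path names are illustrative):

# Sketch of the accepted workaround: nest the columns in a notebook and
# write the documents as JSON files into the lakehouse Files area.
from pyspark.sql import functions as F

df = spark.table("table1").select(
    "a",
    F.struct("b", "c", "d").alias("parameters"),  # nested sub-document
)

# Each output file contains one JSON document per line; a copy activity
# with a JSON source then carries the hierarchy into MongoDB as-is.
df.write.mode("overwrite").json("Files/mongo_staging/table1")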


6 REPLIES
Vinodh247
Resolver IV

Fabric Data pipelines do not currently support HierarchicalTranslator mappings for MongoDB sinks; as of now, only tabular mappings are supported. When you try to set the translator to HierarchicalTranslator, it silently reverts to TabularTranslator, and MongoDB ends up storing keys like 'parameters.b' instead of nesting the fields. In the background, the Fabric copy activity currently does not interpret dot notation ('parameters.b') in mappings as hierarchy when writing to MongoDB. It simply writes the key as-is, not as a nested object.

With the current limitations, the best workarounds that you can try are to:

  • use a Fabric notebook to transform and write to MongoDB using pymongo (see the sketch after this list)

  • for prod use cases requiring GUI-based ETL with hierarchical outputs, consider ADF or Synapse pipelines
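For the notebook route, a minimal sketch with pymongo (this assumes the default Spark session in a Fabric notebook; the connection string, database, and collection names are placeholders):

from pymongo import MongoClient

client = MongoClient("mongodb+srv://<user>:<password>@<cluster>/")  # placeholder URI
collection = client["mydb"]["mycollection"]  # hypothetical database/collection names

# Read the lakehouse table; collect() is fine for small tables,
# batch the writes for anything large.
rows = spark.table("table1").collect()

docs = [
    {
        "a": row["a"],
        "parameters": {"b": row["b"], "c": row["c"], "d": row["d"]},
    }
    for row in rows
]

collection.insert_many(docs)  # the nested structure is preserved as written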

 


Please 'Kudos' and 'Accept as Solution' if this answered your query.

Thanks Vinodh for the reply.

Re - 

  • "use a fabric notebook to transform and write to MongoDB using pymongo"

This has another problem: Fabric notebook IPs are non-deterministic, so I can't whitelist the MongoDB cluster specifically for the notebooks unless I resort to whitelisting it for all IP ranges, which won't be appropriate.

burakkaragoz
Community Champion

Hi @shivangjha ,

 

You're hitting a common limitation with Fabric copy activities when trying to create nested JSON structures. The dot notation approach you tried creates literal field names instead of hierarchical structures.

Here's what actually works:

Option 1: Use Data Flow instead

Copy activity isn't great for complex JSON transformations. Switch to Data Flow:

  • Add your lakehouse table as source
  • Use Derived Column transformation:
     
    parameters = @(b=b, c=c, d=d)
  • Select only a and parameters columns
  • Send to MongoDB sink

Option 2: Pre-build the structure in your lakehouse

Create a view or use SQL to structure your data first:

SELECT
    a,
    STRUCT(b AS b, c AS c, d AS d) AS parameters
FROM table1

Option 3: JSON string approach (last resort)

If you must stick with the copy activity, create the nested object as a JSON string:

parameters = concat('{"b":"', b, '","c":"', c, '","d":"', d, '"}')

The HierarchicalTranslator reverting to Tabular is expected behavior - it's not fully supported for MongoDB sinks in Fabric yet.

Data Flow is your best bet for reliable hierarchical JSON creation. Have you tried that approach yet?


If my response resolved your query, kindly mark it as the Accepted Solution to assist others. Additionally, I would be grateful for a 'Kudos' if you found my response helpful.
This response was assisted by AI for translation and formatting purposes.

Update -
Dataflows do not work. Currently, the output destinations are limited; MongoDB as a sink is not supported for Fabric Dataflow activities.

Thanks for the response @burakkaragoz 
Will try Data Flow for this.

Re pre-building the JSON -
"Option 2: Pre-build the structure in your lakehouse. Create a view or use SQL to structure your data first:"

I tried this transformation in a notebook and saved the result as a Delta table, which could then supposedly be referenced in the copy activity. The copy activity complains about the STRUCT data type (a sketch of the attempt follows):
  • Fabric Copy Activity does not support STRUCT types in Delta
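For reference, the attempted notebook transformation looked roughly like this (a sketch assuming the default Spark session; table names are illustrative):

from pyspark.sql import functions as F

df = spark.table("table1")

nested = df.select(
    "a",
    F.struct("b", "c", "d").alias("parameters"),  # STRUCT column
)

# Saving as a Delta table succeeds, but the copy activity then rejects
# the STRUCT-typed 'parameters' column at the source.
nested.write.format("delta").mode("overwrite").saveAsTable("table1_nested")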
Will try the Data Flow approach.
Thanks
