Anagoor
New Member

How to parameterize output from Dataflow

Good morning,

I'm having an issue parameterizing the output of a Dataflow Gen2.

 

In particular, based on some parameters I choose which table to create. The tables generally have different schemas, but they all have the same lakehouse as a destination.

 

What I did was convert all columns to string, so that they all bind automatically when choosing the destination, and I parameterized the output table name.

 

The job always fails. I suspect it is because of the different schemas.

 

Does anyone have any idea how to solve this?

1 ACCEPTED SOLUTION
AntoineW
Memorable Member

Hello @Anagoor,

 

It's a bit frustrating, and I've run into it as well: in Dataflow Gen2 the destination table (sink) can't be parameterized, because the destination isn't managed through Power Query M. Only the source side can be parameterized.

The failures come from trying to write multiple tables with different schemas through a single dataflow output; a schema mismatch will always break the run.

The recommended approaches are:

  • Use a Pipeline with a Notebook or Copy activity for dynamic routing and schema-aware loads, or

  • Create separate Dataflows Gen2 for each target schema.
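As a rough illustration of the first option, here is a minimal sketch of the routing logic a Fabric notebook could implement. All table names, parameter values, and schemas below are hypothetical, and the actual lakehouse write is only indicated in a comment:

```python
# Hypothetical sketch: a parameter picks the destination table, and each
# destination has its own expected schema, so rows are validated against
# that schema before the load step. Names here are made up for illustration.

ROUTES = {
    "sales": {"table": "dim_sales",
              "columns": {"order_id", "amount", "order_date"}},
    "customers": {"table": "dim_customers",
                  "columns": {"customer_id", "name", "country"}},
}

def route_load(param: str, rows: list) -> str:
    """Pick the destination table for `param` and fail fast on schema drift."""
    route = ROUTES[param]
    expected = route["columns"]
    for row in rows:
        if set(row) != expected:
            raise ValueError(
                f"schema mismatch for {route['table']}: {set(row)} != {expected}"
            )
    # In a real notebook this is where you would write to the lakehouse,
    # e.g. with Spark: df.write.mode("append").saveAsTable(route["table"])
    return route["table"]
```

Because each branch knows its own schema, a mismatched payload fails loudly before anything is written, instead of failing inside the dataflow run.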

 

Hope it helps!

Best regards,

Antoine

View solution in original post

2 REPLIES
v-kpoloju-msft
Community Support

Hi @Anagoor

Thank you for reaching out to the Microsoft Fabric Community Forum. I also appreciate @AntoineW's contributions to this thread. The solution shared by @AntoineW is correct and addresses your question. In addition, I have included some alternative options below.

You are correct: in Dataflow Gen2, the destination table (sink) cannot be parameterized; only the source side can. The failures happen because you are trying to write multiple tables with different schemas from a single dataflow, which will always break the run.

Alternatives:
• Use a super-schema with all columns and fill missing ones with nulls.
• Stage data as files first and then register them as tables.
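As a rough illustration of the super-schema option, here is a minimal sketch in plain Python (column names are hypothetical): union all columns across the source tables, then pad each record with nulls for columns it lacks, so every record fits one shared destination schema.

```python
# Super-schema sketch: build the union of all column names across tables,
# then project each row onto that union, filling missing columns with None.

def build_super_schema(tables: list) -> list:
    """Collect the union of all column names, in first-seen order."""
    columns = []
    for table in tables:
        for row in table:
            for col in row:
                if col not in columns:
                    columns.append(col)
    return columns

def conform(row: dict, columns: list) -> dict:
    """Project a row onto the super-schema, filling missing columns with None."""
    return {col: row.get(col) for col in columns}
```

Since every conformed row has the same columns, a single parameterized destination table can accept all of them; the trade-off is a wider, sparser table.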

Refer to these links:
1. https://learn.microsoft.com/en-gb/fabric/data-factory/dataflow-gen2-data-destinations-and-managed-se... 
2. https://learn.microsoft.com/en-gb/fabric/data-factory/dataflow-parameters 
3. https://learn.microsoft.com/en-us/fabric/data-factory/copy-data-activity 

Hope this clears it up. Let us know if you have any doubts regarding this. We will be happy to help.

Thank you for using the Microsoft Fabric Community Forum.

