
WishAskedSooner
Continued Contributor

Dataflow Gen2 Connections

Do I understand this correctly? Dataflow Gen2 (Power Query) can connect to over 300 different sources. Yet, despite this massive list, Data Pipeline activity outputs, parameters, variables, etc. are not among them?

 

Microsoft brags about how they brought Azure Data Factory and Power Query together into Fabric. I fell for the hype and signed up for a premium license, believing that I could finally port my complex PowerQueries into an end-to-end solution with ETL management.

 

Only now, to my dismay, I am discovering that Microsoft simply shoved PQ into Fabric in a ham-fisted manner, with no ability to actually configure a pipeline that relies on metadata. Instead, Notebooks seem to be the only solution for any ETL process, even simple ones.

 

Someone please talk me down from the ledge. Maybe I don't get it, but I am afraid I do.

1 ACCEPTED SOLUTION
mllopis
Community Admin

Thanks for the feedback, and sorry that you're struggling to realize the value of Data Factory in Fabric or to make your specific use cases work.

 

I am not entirely sure which of these two use cases you are aiming for, so let me cover both:

  1. Having parameterized dataflows that you orchestrate in a Data Pipeline, passing different Dataflow parameter values at runtime - First-class support for this common usage pattern is something we're working towards (it is tracked as a very popular feature request here). In the meantime, you can emulate the behavior with a slightly more involved solution: store those "runtime parameters" as data values in a table that the Pipeline writes to (to set the value) and the Dataflow reads from (as dynamic input to your dataflow queries); a sketch of this pattern follows this list. A similar approach is covered in this blog: CHANGE (IN THE HOUSE OF LAKES) - It's Not About The Cell (itsnotaboutthecell.com)
  2. Transforming data ingested via a Data Pipeline with a Dataflow: This is possible today as long as the data sink you pick in your Data Pipeline can also be used as a data source in the dataflow. Furthermore, within the same Data Pipeline you can orchestrate the dataflow so it runs once the Data Pipeline activity that ingests data into that location completes. This article provides further details: Use a dataflow in a pipeline - Microsoft Fabric | Microsoft Learn
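
To make option 1 above more concrete, here is a minimal Power Query (M) sketch of that parameter-table workaround. It assumes the pipeline maintains a table dbo.PipelineParameters(ParameterName, ParameterValue) in a Warehouse (or Lakehouse SQL endpoint) before it invokes the dataflow; the endpoint, table, column, and parameter names are all illustrative placeholders, not anything built into Fabric.

// Minimal sketch, assuming the pipeline writes dbo.PipelineParameters(ParameterName, ParameterValue)
// to a Warehouse/Lakehouse SQL endpoint before the Dataflow activity runs. All names are placeholders.
let
    // Connect to the SQL endpoint the pipeline writes to
    Source = Sql.Database("your-endpoint.datawarehouse.fabric.microsoft.com", "YourWarehouse"),

    // The parameter table the pipeline updates before each dataflow run
    ParamTable = Source{[Schema = "dbo", Item = "PipelineParameters"]}[Data],

    // Read the value the pipeline set for the "LoadDate" parameter (illustrative name)
    LoadDateValue = ParamTable{[ParameterName = "LoadDate"]}[ParameterValue],
    LoadDate = Date.From(LoadDateValue),

    // Use that value as dynamic input to the rest of the query
    Sales = Source{[Schema = "dbo", Item = "Sales"]}[Data],
    FilteredSales = Table.SelectRows(Sales, each [OrderDate] >= LoadDate)
in
    FilteredSales

In the pipeline itself, an activity that precedes the Dataflow activity (for example a Script, Stored procedure, or Copy activity) updates dbo.PipelineParameters with the value for that run, so the dataflow picks up the new "parameter" each time it executes. Once native runtime parameter passing for Dataflow Gen2 ships, this indirection can go away.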

Thank you again for your feedback. Please let us know if this information helped or if you have further questions.

 

Regards,
M.


