Hi everyone,
I’ve built a Dataflow Gen2 pipeline in Fabric that connects to a Salesforce report (using the report ID as the source) and writes the data to a SharePoint folder as the destination.
Currently, the destination in our mashup.pq file is static — it points to a fixed SharePoint site, folder, and file name. I am trying to make it dynamic by parameterizing these values (e.g., sharepoint_site_url, sharepoint_folder_path, output_filename, and even the CSV delimiter).
Here’s what I’ve done so far: I updated mashup.pq in the repo to reference these dynamic parameters. That is where I am stuck: we don’t see any option to sync or import those changes back into the Fabric dataflow.
Could anyone advise whether destination settings (site, folder, file name) can be parameterized in Dataflow Gen2, and whether changes made to mashup.pq in the repo can be synced back into the dataflow?
Any guidance or examples from others who’ve implemented something similar would be really helpful!
Thanks in advance,
Sharath
Currently, Dataflow Gen2 doesn’t support dynamic destination paths or filenames (for example, parameterizing SharePoint site URL, folder path, or output file name). You can define parameters for source connections and query logic, but not for destination configuration — those settings must remain static.
Also, the Git integration in Fabric is one-way. Changes made in Fabric sync to Azure DevOps, but updates made directly in DevOps (like editing mashup.pq) don’t automatically sync back. The only supported method is to open the dataflow in Fabric, load the updated code manually, and click Save/Publish — this triggers a refresh of the dataflow definition.
Workarounds:
Keep a static destination in the Dataflow, and use a Fabric Pipeline or Power Automate step afterward to move or rename the generated file dynamically (see the sketch at the end of this reply).
Use parameters for other flexible items (like delimiter or site URL if you have multiple environments).
If dynamic destination control is critical, use Fabric Pipelines or Azure Data Factory Copy Activity, which support fully parameterized sink paths.
In short — parameterizing destination paths isn’t yet supported, and reverse Git sync (DevOps → Fabric) isn’t available. Manual save or post-processing automation is the only workaround for now.
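For illustration only, here is a minimal Python sketch of that first workaround. It assumes the dataflow has already written its fixed output file into the SharePoint library, and that a small script or notebook step (in place of the Pipeline/Power Automate step mentioned above) then renames it through the Microsoft Graph drive API. The site id, paths, file names, and token handling are placeholders, not values from this thread.

```python
# Hedged sketch: rename the static dataflow output inside SharePoint via
# Microsoft Graph, after the dataflow has finished. All identifiers below
# are illustrative placeholders.
import requests

sharepoint_site_id = "<graph-site-id>"            # placeholder: resolve via Graph /sites endpoint
static_file_path = "Staging/report.csv"           # fixed path the dataflow writes to (library-relative)
new_filename = "salesforce_report_2024-05-01.csv" # built at runtime from your parameters
access_token = "<token-from-entra-app>"           # placeholder: e.g. acquired with MSAL

# Address the existing drive item by path, then PATCH its name to rename it.
item_url = (
    "https://graph.microsoft.com/v1.0/sites/"
    f"{sharepoint_site_id}/drive/root:/{static_file_path}"
)
resp = requests.patch(
    item_url,
    headers={"Authorization": f"Bearer {access_token}"},
    json={"name": new_filename},  # add a "parentReference" to move it to another folder as well
)
resp.raise_for_status()
print("Renamed to:", resp.json()["name"])
```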
Please like and mark as "Accepted Solution" if you find it useful.
Hi @sharathKumar,
Thank you for reaching out to Microsoft Fabric Community.
Thank you @Vinodh247 for the prompt response.
As we haven’t heard back from you, we wanted to kindly follow up and check whether the solution provided above worked for you. Please let us know if you need any further assistance.
Thanks and regards,
Anjan Kumar Chippa
Thank you @Vinodh247 for the quick response.
I am thinking of building a notebook for SharePoint so that I can make it dynamic.
Hi @sharathKumar,
Yes, that is correct. Currently, Dataflow Gen2 does not support fully dynamic SharePoint destinations (path or filename).
The best approach is to write the dataflow output to OneLake (a lakehouse or Files location) and then use a Fabric pipeline or notebook to upload or rename the file in SharePoint using runtime parameters.
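As a rough illustration of that approach (not an official sample), here is a minimal Python sketch for a Fabric notebook: it reads the dataflow output from the default lakehouse Files area and uploads it to SharePoint with a simple Microsoft Graph call. The site id, folder path, file name, and token acquisition are placeholder assumptions you would replace with your own runtime parameters.

```python
# Hedged sketch for a Fabric notebook: lakehouse Files -> SharePoint upload
# via Microsoft Graph. All names below are illustrative placeholders.
import requests

# Runtime parameters, e.g. passed in from a pipeline "Run notebook" activity
lakehouse_file = "/lakehouse/default/Files/output/salesforce_report.csv"  # mounted default lakehouse path
sharepoint_site_id = "<graph-site-id>"                                    # placeholder
sharepoint_folder_path = "Reports"                                        # library-relative folder
output_filename = "salesforce_report_2024Q1.csv"
access_token = "<token-from-msal-client-credentials-flow>"                # placeholder

with open(lakehouse_file, "rb") as f:
    content = f.read()

# Simple upload (suitable for files under ~4 MB; larger files need an upload session)
url = (
    "https://graph.microsoft.com/v1.0/sites/"
    f"{sharepoint_site_id}/drive/root:/{sharepoint_folder_path}/{output_filename}:/content"
)
resp = requests.put(
    url,
    headers={"Authorization": f"Bearer {access_token}"},
    data=content,
)
resp.raise_for_status()
print("Uploaded:", resp.json().get("webUrl"))
```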
Thanks and regards,
Anjan Kumar Chippa
Hi @sharathKumar,
As we haven’t heard back from you, we wanted to kindly follow up and check whether the solution I provided worked for you. Please let us know if you need any further assistance.
Thanks and regards,
Anjan Kumar Chippa
Hi @sharathKumar,
We wanted to kindly follow up and check whether the solution provided above worked for you. Please let us know if you need any further assistance.
Thanks and regards,
Anjan Kumar Chippa
Dynamic destination paths: Not supported yet in Dataflow Gen2. You can parameterize sources, but destinations (such as SharePoint paths, file names, or delimiters) are static in the current Dataflow Gen2. Microsoft has not yet exposed destination parameters or runtime expressions for outputs; given the demand, I believe we can expect this soon.
Syncing updated mashup.pq back from Git to Fabric: Currently one-way only (Fabric --> Git). There is no supported reverse sync, and editing mashup.pq in DevOps does not reflect in Fabric automatically. The only workaround is to do it manually: open the dataflow in Fabric -> Advanced Editor -> replace the mashup code -> save and refresh.
Potential workarounds you can try:
Use a notebook or data pipeline after the dataflow to dynamically move or rename the output file based on parameters (see the sketch below).
Alternatively, use a lakehouse or OneLake destination and then orchestrate file movement or renaming with a Fabric pipeline activity.
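As a rough sketch of the notebook option (illustrative only, assuming a Fabric notebook attached to the lakehouse the dataflow writes to, and hypothetical paths and parameter names):

```python
# Hedged sketch: move/rename the static dataflow output inside the lakehouse
# from a Fabric notebook step. Paths and parameter names are illustrative;
# relative "Files/..." paths resolve against the attached default lakehouse
# (a full abfss:// OneLake path works too).
from datetime import datetime

from notebookutils import mssparkutils  # notebook utilities in Fabric/Synapse Spark

# These could be bound to pipeline parameters on the notebook activity
source_path = "Files/staging/salesforce_report.csv"   # static dataflow output
target_folder = "Files/published/sales"
output_filename = f"salesforce_report_{datetime.utcnow():%Y%m%d}.csv"

mssparkutils.fs.mkdirs(target_folder)                  # ensure the destination folder exists
mssparkutils.fs.mv(source_path, f"{target_folder}/{output_filename}")
print(f"Moved {source_path} -> {target_folder}/{output_filename}")
```

A pipeline can then pass the folder and file name as parameters to this notebook activity, so the destination is decided entirely at runtime while the dataflow itself keeps a static output.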