Replace Dataflow M Code (Keep Dataflow ID)

Add the ability to fully replace (overwrite) a Dataflow's entire M code without creating a new Dataflow, and without having to edit each entity/query one by one. This could be done via import of a .json file, provided the import does NOT create a new Dataflow (with a new Dataflow ID, as it currently does). Basically, import-and-replace functionality. This would keep the Dataflow ID consistent, so existing links from Datasets would remain intact.
Status: Needs Votes
Comments
meurerc
New Member
I deal with this often - please enable this functionality
willem_don
New Member
Critical feature to allow for any kind of versioning
michael_ferris
New Member
Feels essential - we deploy the same content to multiple tenants and need to be able to get newer versions out there without having to either manually apply updates or reconfigure reports to use new dataflow IDs. Standard DevOps activity really - can update a report by republishing over the top of the original, so why not the same for dataflows?
salil_athalye
New Member
We need the ability to have real DTAP for dataflows that have linked entities. I have an ingestion dataflow workspace that is used by a transform dataflow workspace as linked entities, one pair for each stage in Dev / Test / Prod. Since the linked entities cannot be edited in the transform workspace, I can't parameterize the links so that they would move to point to dev / test / prod ingestion entities. Since everything seems to be captured in the same JSON file, if I import the dev JSON file into test it will require a lot of rework to reestablish all the links to point to test.
Arman_Syzdykov
New Member
Sort of a workaround while this feature is not available yet: You can select all queries from the source dataflow (either by CTRL+click or by SHIFT+click) and just copy and paste them to a target dataflow. You will still need to modify the parameters, but this might save you some time.
Javier_Blanco1
New Member
Please enable this functionality in the Power BI Service. All objects (datasets and reports) can be replaced except dataflows. It makes no sense, and the "move to prod" takes a lot of time because of this. Thanks in advance!
Imke
New Member
As a workaround, you can use Marcus Wegener's publish2dataflow tool, which exports all queries from Power BI Desktop into a dataflow. It has an option that allows you to overwrite an existing dataflow: https://github.com/MarcusWegener/Export2Dataflow/blob/main/publish2dataflow.pbitool.json
Scott_Thomson1
New Member

I have determined that the REST Import call can be used to update an existing dataflow with JSON text or script, but only if that dataflow was itself first created by a REST Import call. So you would have to create a dataflow, export it, import it as new, and then remove the first one. However, if you push this "updatable" dataflow through a deployment pipeline, it loses its ability to be updated, so you are back to square one. Microsoft needs to allow all dataflows to be updated via REST call.
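The import-based update described above can be sketched as follows. This is a minimal sketch, not a confirmed recipe: it assumes the dataflow was created via the Imports API (as the comment notes), that your tenant allows this, and that `group_id` and `access_token` are placeholders you supply yourself. The `datasetDisplayName=model.json` and `nameConflict=GenericOverwrite` parameters are the documented way to ask the Imports endpoint to overwrite a dataflow rather than create a new one.

```python
from urllib.parse import urlencode


def build_import_url(group_id: str, conflict: str = "GenericOverwrite") -> str:
    """Build the Power BI Imports endpoint URL for overwriting a dataflow.

    For dataflow imports, datasetDisplayName must be 'model.json';
    nameConflict=GenericOverwrite asks the service to replace the existing
    dataflow (keeping its Dataflow ID) instead of creating a new one.
    """
    base = f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}/imports"
    query = urlencode({
        "datasetDisplayName": "model.json",
        "nameConflict": conflict,
    })
    return f"{base}?{query}"


# The actual call would POST the exported model.json as multipart form data,
# e.g. with the `requests` library (access_token is a placeholder):
#
#   requests.post(
#       build_import_url(group_id),
#       headers={"Authorization": f"Bearer {access_token}"},
#       files={"file": open("model.json", "rb")},
#   )

print(build_import_url("11111111-2222-3333-4444-555555555555"))
```

If the target dataflow was not originally created through this endpoint, or has been deployed through a pipeline, the overwrite reportedly fails, which is exactly the limitation this idea asks Microsoft to lift.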

fbcideas_migusr
New Member
Status changed to: Needs Votes