Hi,
I have a data flow that reads data from a source and writes it to a destination.
If the data in an existing record changes, or a new record appears, the options under Settings -> Update method work fine: Allow upsert, insert, and update.
But if records disappear from the source, they are not deleted in the destination, even though I have the 'Allow delete' option checked.
Any idea why this might be failing?
Thanks in advance!
Hello,
Ultimately, the issue was that I wasn't configuring it correctly.
When you use a destination in a data flow and enable Allow insert, Allow upsert, Allow delete, etc. in the sink, you must first add an Alter Row transformation that defines the condition for each of those actions. In other words, you define what triggers an insert, upsert, or delete in the Alter Row step, and then permit that action in the sink. If you don't do both, it doesn't work.
Thank you all!
Hi @AnHell ,
In Azure Data Factory (ADF) or Synapse Data Flows, the “Allow delete” option in the sink settings does not work on its own. It requires a corresponding “Alter Row” transformation to explicitly define which rows should be deleted.
Please try the steps below to enable deletes in the data flow.
1. Add an “Alter Row” transformation before the sink.
2. In the Alter Row transformation, define the condition for deletion with a "Delete if" row policy, for example:
deleteIf(isNull(sourceColumn))
Note: With this condition, rows whose sourceColumn is null are marked for deletion in the destination. Keep in mind that the data flow only acts on rows that reach the Alter Row step, so "missing from the source" usually has to be detected first, for example with a Join or Exists transformation.
3. In the Sink settings, enable "Allow delete"; optionally enable Allow insert, update, or upsert depending on your use case.
4. Check that your sink dataset supports deletion (for example, SQL-based sinks with a primary key).
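To make the Alter Row / sink interaction concrete, here is a minimal Python sketch of the behavior described above. It is illustrative only (ADF data flows are not Python), and the names `alter_row`, `sink`, and `src` are invented for the example:

```python
# Sketch: Alter Row marks each row with an action; the sink then applies only
# the actions its update-method checkboxes (Allow delete/upsert/...) permit.

def alter_row(rows, delete_if, upsert_if):
    """Tag each row with the first matching row policy, like ADF's Alter Row."""
    marked = []
    for row in rows:
        if delete_if(row):
            marked.append((row, "delete"))
        elif upsert_if(row):
            marked.append((row, "upsert"))
        else:
            marked.append((row, "none"))
    return marked

def sink(marked_rows, dest, key, allow_delete=True, allow_upsert=True):
    """Apply only the actions the sink's settings allow, matching on a key."""
    for row, action in marked_rows:
        if action == "delete" and allow_delete:
            dest.pop(row[key], None)
        elif action == "upsert" and allow_upsert:
            dest[row[key]] = row
    return dest

# Example: a row whose source value is null gets marked and then deleted.
dest = {1: {"id": 1}, 2: {"id": 2}}
rows = [{"id": 1, "src": None}, {"id": 2, "src": "x"}]
marked = alter_row(rows,
                   delete_if=lambda r: r["src"] is None,
                   upsert_if=lambda r: True)
sink(marked, dest, key="id")
# dest now holds only the row with id 2
```

The point of the sketch: marking a row for deletion (Alter Row) and permitting deletion (the sink's "Allow delete" checkbox) are separate steps, and both are required.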
Please refer to the link below.
Alter row transformation in mapping data flow - Azure Data Factory & Azure Synapse | Microsoft Learn
I hope this information helps. Please do let us know if you have any further queries.
Regards,
Dinesh
Hi @AnHell ,
We haven't heard from you since the last response and were just checking back to see whether you have a resolution yet. If you have any further queries, do let us know.
Regards,
Dinesh
Hi @v-dineshya ,
I'm sorry, I haven't managed to get it working yet, but the situation has changed slightly.
Here's where I stand.
I have a dataflow that reads from a view in a Fabric warehouse. The view is filtered to always return the same record, just one.
The destination is a table in Dataverse with 10 records.
The primary key field from the source is written to the primary key field in the destination.
Before the sink, I have an Alter Row transformation where the condition for both the "Delete if" and "Upsert if" policies is set to true().
In the sink's Settings tab, I have "Allow delete" and "Allow upsert" checked, and in the Alternate key name dropdown, I've selected the primary key field of the destination.
When I run this data flow, I expect all the records in the destination to disappear, leaving only this single record, but nothing is deleted.
I'm still investigating what I'm doing wrong, but I can't find the problem.
Thanks!
Hi @AnHell ,
Thank you for the update.
Please try the structure below for your data flow.
1. Source 1: your filtered view from the Fabric warehouse, returning 1 record. Source 2: the Dataverse destination table with 10 records.
2. Exists transformation: configure it to find records in Dataverse (Source 2) that do not exist in Source 1. This identifies the records that should be deleted.
3. Alter Row transformation: use deleteIf(true()) for all rows coming from the Exists transformation.
4. Sink transformation: connect to Dataverse, enable Allow delete and Allow upsert, and set the Alternate key name to the correct primary key field.
Note: Make sure the key column is defined so ADF can match rows for deletion.
5. Run in a pipeline, not just debug mode: deletes won't execute in debug mode. You must run the data flow inside a pipeline using an Execute Data Flow activity.
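The Exists step in point 2 is essentially an anti-join on the key column. As a rough Python sketch (illustrative only; `rows_to_delete` is an invented name):

```python
# Sketch of the Exists ("doesn't exist") transformation: find destination
# rows whose key is absent from the source, i.e. the rows to delete.

def rows_to_delete(source_rows, dest_rows, key):
    source_keys = {r[key] for r in source_rows}
    return [r for r in dest_rows if r[key] not in source_keys]

source = [{"id": 7}]                      # the single filtered record
dest = [{"id": i} for i in range(1, 11)]  # the 10 Dataverse rows
doomed = rows_to_delete(source, dest, "id")
# 9 rows continue to the Alter Row step with deleteIf(true()); id 7 survives
```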
Please refer to the links below.
Sink transformation in mapping data flow - Azure Data Factory & Azure Synapse | Microsoft Learn
in dataflow the update method allow delete is not working - Microsoft Q&A
Data Flow - "Delete If" setting in Alter Row - Microsoft Q&A
I hope this information helps. Please do let us know if you have any further queries.
Regards,
Dinesh
Hi, @v-dineshya
That was the problem! ... I wasn't using the Exists transformation.
I had fixed it by adding the sink table as a second source, joining against it, and detecting the null values for the records I need to delete. But with the Exists transformation, I find it easier to implement.
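For comparison, the join-based workaround mentioned above can be sketched in Python (illustrative only): a left join of the destination against the source, where a null match marks the row for deletion.

```python
# Sketch of the join workaround: left-join destination rows to source rows
# on the key; a null join result means the row is missing from the source.

def delete_candidates_via_join(dest_rows, source_rows, key):
    source_by_key = {r[key]: r for r in source_rows}
    joined = [(d, source_by_key.get(d[key])) for d in dest_rows]
    return [d for d, s in joined if s is None]

dest = [{"id": i} for i in range(1, 4)]
source = [{"id": 2}]
to_delete = delete_candidates_via_join(dest, source, "id")
# rows with ids 1 and 3 are the delete candidates
```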
Thank you very much for everything!
Hello,
I need to try this option with my colleague, who is out of the office. I am waiting to try it and hope to be able to reply to you that this was the solution.
Thank you!
Hi @AnHell ,
Could you please provide an ETA (estimated time of arrival) for your response to this thread?
Regards,
Dinesh
Hi @AnHell ,
Thank you for reaching out to the Microsoft Community Forum.
Hi @tayloramy , Thank you for your prompt response.
Hi @AnHell , Could you please try the proposed solution shared by @tayloramy ? Let us know if you're still facing the same issue; we'll be happy to assist you further.
Regards,
Dinesh
Hi @AnHell,
Are you using Dataflow Gen 1, or Dataflow Gen 2?
If you're using Dataflow Gen 2, you need to change the update method of the target connection to Replace.
If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.