I want to work with data stored in Azure SQL from within the Dataverse, which is easy to do using a dataflow. I LOVE dataflows and want to go no-code/low-code on this as much as possible. However, I then want to pass the data back to the Azure SQL DB and update any records that were modified by the dataflow. Is it possible to set up some sort of automated sync between the Dataverse and an Azure SQL DB? I know there is the Azure Synapse Link option, but that suggests to me that it simply copies (rather than updates) data.
The datasets in question could range anywhere from 1,000 to 100,000 items, so I need the most performant solution possible.
Any Power Platform gurus out there with ideas?
Dataflows only work one way, bringing data from an external source into the Dataverse, and only for data; if any metadata changes occur, a dataflow won't carry them over.
Azure Synapse Link is for sending Dataverse data to Synapse for analytics, storing it in a data lake, or pushing it to an external source using ADF. It only works for tables that have change tracking enabled.
Since in your case it has to work both ways, I would recommend:
1. Virtual tables, since Azure SQL has an OData provider. They are easy to configure and let you handle external data in Power Apps without any code. Make sure the ID in the external data source is in GUID format. Link attached for your reference (https://docs.microsoft.com/en-us/powerapps/maker/data-platform/virtual-entity-odata-provider-require...
2. Or try Azure Data Factory: pull data from the Dataverse and push it to Azure SQL, and vice versa (this setup would be a bit complex). Thinking very broadly on this.
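As an aside, the GUID requirement on the external key in option 1 is easy to verify up front. A minimal sketch in Python (the sample key values below are made up for illustration, not taken from any real table):

```python
import uuid

def is_guid(value) -> bool:
    """Return True if value parses as a GUID/UUID."""
    try:
        uuid.UUID(str(value))
        return True
    except (ValueError, TypeError):
        return False

# Hypothetical sample of key values pulled from the external Azure SQL table
sample_keys = [
    "3f2504e0-4f89-11d3-9a0c-0305e82c3301",  # valid GUID -> usable as-is
    "12345",                                  # plain integer key -> needs a GUID column
]
for key in sample_keys:
    print(key, "->", "OK" if is_guid(key) else "needs a GUID column")
```

If the existing primary key is an integer, a common workaround is to add a separate `uniqueidentifier` column and expose that to the provider.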
If it answers your question, kindly give kudos and accept it as a solution.
Regards,
Prakash
Thanks for the replies, everyone! Apologies for the delayed reply.
@Prakash4691, is it possible to do this with virtual tables created using Microsoft's newer Virtual Connector from AppSource?
@arpost ,
Correct me if I am wrong: the one you are looking for is Azure SQL, not on-premises SQL Server, right?
If it is on-premises, then you can try that one. FYI, it is still in preview, so I would not recommend using it in production.
If it answers your question, kindly give kudos and accept it as a solution.
Regards,
Prakash
Hi @rampprakash ,
Yes, it handles upsert functionality as well. Please see the following: https://www.blog.allandecastro.com/bringing-your-dataverse-data-to-azure-synapse/
Hopefully this is what you were asking.
Thanks,
Drew
Hi @arpost ,
You might want to check out the following process for taking Dataverse data to Azure Synapse: https://cloudblogs.microsoft.com/powerplatform/2021/05/26/accelerate-time-to-insight-with-azure-syna...
Microsoft also has a Data Export Service, but it requires Dynamics 365 licensing (https://docs.microsoft.com/en-us/power-platform/admin/replicate-data-microsoft-azure-sql-database), and even that article recommends the first approach 😀
Hope this helps. Please accept if this answers your question, or Like if it helps in any way.
Thanks,
Drew
Hello @dpoggemann,
Curious question: is it possible to compare and update records based on filtering, like an upsert operation?
Hello @arpost,
I can suggest going with Power Automate (Microsoft Flow):
1. Create a field in the Azure SQL DB to store unique values.
2. Create a field in the Dataverse with the same data type as the field created in step 1.
3. Create 2 flows:
a. A flow triggered when a record is created/updated in the Azure SQL DB
--> Use List rows and check the SQL field from step 1 against the Dataverse field from step 2
--> If a match is found, update; else create
b. A flow triggered when a record is updated in the Dataverse
--> Use List rows to retrieve the record from the SQL DB
--> Then update
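The two flows above boil down to key-based upsert logic. Here is a minimal sketch of that logic in Python, using in-memory dicts as stand-ins for the two stores (a real flow would use the SQL and Dataverse connector actions instead; the field name `sync_key` is a placeholder for the unique field created in steps 1 and 2):

```python
# In-memory stand-ins for the two stores, each keyed by the shared
# unique field ("sync_key") created in steps 1 and 2.
dataverse_rows = {}   # sync_key -> record dict
azure_sql_rows = {}   # sync_key -> record dict

def upsert(target: dict, record: dict) -> str:
    """Flow (a): if a row with the same sync_key exists, update it;
    otherwise create it. Returns which branch ran."""
    key = record["sync_key"]
    if key in target:            # "List rows" found a match -> Update
        target[key].update(record)
        return "updated"
    target[key] = dict(record)   # no match -> Create
    return "created"

# An Azure SQL row arrives -> upsert it into the Dataverse store
print(upsert(dataverse_rows, {"sync_key": "A1", "name": "Widget"}))     # created
print(upsert(dataverse_rows, {"sync_key": "A1", "name": "Widget v2"}))  # updated
```

Flow (b) is the same shape in the other direction, except it only updates, since the record is known to originate in Dataverse.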
Please mark as Answer if it is helpful and provide Kudos
Subscribe : https://www.youtube.com/channel/UCnGNN3hdlKBOr6PXotskNLA
Blog : https://microsoftcrmtechie.blogspot.com
I can think of 2 options:
OOB filter
I haven't actually tested this, but I reckon the Custom Tracking Id function can be used/appended. Failing that, setting some sort of identifier means you can use a trigger filter and ignore changes based on the tracking ID (or some other field set specifically to identify the change). You could also connect with a service principal user set up specifically for this synchronisation and filter/close based on that.
Condition Filter
After receiving the request, if it's possible to identify the source without doing anything else, you could stop the flow there.
Alternatively, you would need to get the target record and compare values.
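To make the two options concrete, here is a hedged Python sketch of the guard each flow could run first. The field names (`modified_by`, `name`, `amount`) and the service principal ID are placeholders for illustration, not actual connector outputs:

```python
# Placeholder ID of the service principal the sync flows connect with.
SYNC_PRINCIPAL_ID = "00000000-0000-0000-0000-0000000000aa"

def should_sync(record: dict, target) -> bool:
    """Return False when the change originated from the sync itself,
    or when the target record already holds identical values."""
    # OOB filter idea: the sync writes with a dedicated service principal,
    # so changes it made can be recognised and skipped, breaking the loop.
    if record.get("modified_by") == SYNC_PRINCIPAL_ID:
        return False
    # Condition filter idea: fetch the target record and compare values;
    # if nothing differs, stop the flow.
    if target is not None and all(
        target.get(f) == record.get(f) for f in ("name", "amount")
    ):
        return False
    return True

print(should_sync({"modified_by": SYNC_PRINCIPAL_ID}, None))   # False: sync's own write
print(should_sync({"modified_by": "user-1", "name": "x", "amount": 3},
                  {"name": "x", "amount": 2}))                 # True: real change
```

Without a guard like this, a bidirectional pair of triggers will happily ping-pong updates between the two stores forever.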