I am currently investigating the use of Power Automate and Dataflows for a scenario in which Excel files are regularly uploaded to a OneDrive folder, with each file containing a table that needs to be upserted into a Dataverse table. After a revealing exchange in a previous discussion, I have compiled a list of questions about the suitability of Dataflows for this scenario:
Is it possible to configure Dataflows to automatically sync with a specified OneDrive folder on a continuous basis, so that any edit, deletion, or addition to the rows in any Excel file within this folder will trigger the Dataflow?
What are the limitations, if any, on the frequency of Dataflows executions within a day? Are Dataflows primarily triggered on a set schedule, or can they be configured to activate in response to changes detected in the OneDrive folder?
In comparison to the upsert function provided by the Dataverse Web API, how does the performance of Dataflows measure up, especially when dealing with large datasets containing millions of rows?
Considering the advice previously given, which involved identifying the table name in the 'List rows present in a table' action, how can one manage tables with names that are not consistent or prone to change? Is there a flexible approach to referencing these tables without resorting to hardcoded names?
The overarching objective is to establish a dependable and scalable pipeline that moves data efficiently from Excel to Dataverse, handling datasets ranging from a few hundred rows to millions of rows.
I eagerly await your suggestions and expert insights. Thank you for your contribution to this discussion.
1. Yes.
2. None if you trigger refreshes through the API.
3. As long as you keep the dataflow nimble (no fancy transforms or attempted merges), performance should "measure up" - whatever that means in your environment. It certainly outperforms Excel.
4. I'm pretty sure you can specify dynamic content in that dialog. It will get hairy, however, when those tables have different structures.
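On point 4, one hedged way to avoid hardcoding table names in a flow: the Excel Online (Business) connector has a "Get tables" action that lists the tables in a workbook, and its output can feed the table field of "List rows present in a table" as dynamic content. Assuming each uploaded file contains exactly one table, an expression along these lines (action name "Get_tables" is whatever your action is called, with spaces replaced by underscores) picks that table's id:

```
first(body('Get_tables')?['value'])?['id']
```

As noted above, this only stays manageable while the tables share a structure; once column sets diverge, the mapping step downstream has to branch per shape.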
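On point 2, a minimal sketch of what an upsert through the Dataverse Web API looks like: a PATCH against the entity set addressed by an alternate key creates the row if the key is absent and updates it if present. Everything here is illustrative - the environment URL, the `cr123_orders` entity set, and the `cr123_ordernumber` alternate key are hypothetical names; substitute your own, and acquire a real bearer token (e.g. via MSAL) before sending.

```python
def build_upsert_request(env_url, entity_set, key_name, key_value, row):
    """Assemble the parts of one Dataverse Web API upsert call.

    Returns a dict with method, URL, headers, and body; sending it
    (e.g. with the requests library) is left out so the sketch stays
    self-contained and network-free.
    """
    # Addressing a record by alternate key makes PATCH behave as an
    # upsert: insert when the key doesn't exist, update when it does.
    url = f"{env_url}/api/data/v9.2/{entity_set}({key_name}='{key_value}')"
    headers = {
        "Content-Type": "application/json",
        "OData-MaxVersion": "4.0",
        "OData-Version": "4.0",
        # "Authorization": "Bearer <token>"  -- omitted in this sketch
    }
    return {"method": "PATCH", "url": url, "headers": headers, "body": row}

# Hypothetical example: one row keyed by an order number.
req = build_upsert_request(
    "https://contoso.crm.dynamics.com",  # placeholder environment
    "cr123_orders", "cr123_ordernumber", "ORD-0001",
    {"cr123_amount": 99.5},
)
```

For millions of rows you would not send these one at a time; the Web API supports `$batch` requests to group many upserts per round trip, which is where it can outpace a scheduled dataflow refresh.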