I have a Power Apps dataflow that pulls data from Dynamics 365 Business Central over an OData connection. It is scheduled to refresh automatically five times a day. You would think a connection to another Microsoft platform within the same Microsoft 365 tenant would be stable, but it is not. The dataflow is erratic: it might fail at 8 AM, run fine for two days and nine syncs, and then fail at 11 AM on the third day.
There are no other dataflows in the environment. The connection points to a development Business Central environment, so there is little to no user interaction with Business Central. The dataflow itself sits in a development Power Platform environment, again with little to no user interaction.
Any ideas why a dataflow would be this erratic and fail sometimes, but not always?
Hello -
You should not use the OData endpoint for large data volumes, since it is very slow. For such scenarios the recommended approach is BYODB: export the entity to your own Azure SQL database and then point Power BI at that database.
Or you could try Incremental Refresh in Power BI Dataflows.
Generally speaking, we do not expect blazing-fast refresh performance over OData connections. The best solution is to migrate the data into an intermediary repository, such as an Azure SQL Database, Azure Data Lake Store, or even a simple Azure Storage account, and then connect Power BI to that store.
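If you want to stay on OData for now, intermittent failures like this are often timeouts rather than hard errors, so one low-effort thing to try is raising the request timeout in the query itself. A minimal Power Query sketch (the service URL below is a placeholder; substitute your own Business Central tenant and environment):

```powerquery
let
    // Placeholder URL -- replace with your actual Business Central OData endpoint
    Source = OData.Feed(
        "https://api.businesscentral.dynamics.com/v2.0/<tenant>/<environment>/ODataV4",
        null,
        [
            // Raise the per-request timeout; slow OData responses that
            // exceed the default can surface as sporadic refresh failures
            Timeout = #duration(0, 0, 30, 0),
            // Use the newer OData.Feed implementation
            Implementation = "2.0"
        ]
    )
in
    Source
```

This does not fix the underlying slowness of the endpoint, but it can tell you whether the erratic failures are timeout-related before you invest in BYODB or an intermediary store.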
Optimising OData Refresh Performance in Power Query for Power BI and Excel - BI Insight