I am using an OData feed as the data source in my report, connecting to entities from Dynamics 365 Finance and Operations.
Some of my projects have encountered issues during refresh after publishing to the Power BI Service.
Specifically, some entities cause refresh failures, and I receive the following error message:
"The XML for Analysis request timed out before it was completed. Timeout value: 7200" — sometimes this timeout value varies.
So I would like to understand what the actual issue is:
Is it that the entity itself is not efficient for retrieving data from D365 F&O?
Or is it that Power BI Service has limitations in handling large datasets?
Or is there a setting I can modify to increase the timeout in Power BI Service?
I already tried adjusting the timeout in Power Query using the following:
= OData.Feed("https://mydyn365.URL/data", null, [ Timeout = #duration(0, 3, 0, 0) ])
But it didn’t have any effect.
Hi @amal_01 ,
This is a pretty common pain point when refreshing Power BI datasets from D365 OData feeds.
The “XML for Analysis request timed out” error with a value like 7200 seconds is coming from the D365 OData service, not from Power BI itself. Power BI Service actually allows a longer timeout for refreshes (usually 2–5 hours), but if D365 cuts off the query after 7200 seconds (2 hours), Power BI can’t do anything about it. Changing the timeout in Power Query or M won’t help here, because the data source’s own limit always wins.
Try to filter your queries as much as possible, or break them into smaller parts so each one finishes faster (see the first sketch below). If you’re using Premium, consider setting up Incremental Refresh; it can really help with large datasets (second sketch below). You can ask your D365 admin whether the OData timeout can be raised, but that’s rarely done in production. Some teams use Azure Data Factory or Power BI Dataflows to stage the data elsewhere first, especially when dealing with massive volumes.
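For example, here is a minimal sketch of that filtering approach in Power Query M. The URL, the SalesInvoiceHeaders entity and the column names are placeholders, not your actual model; swap in whichever entity is timing out for you:

let
    // Placeholder F&O URL and entity name - replace with your own environment
    Source = OData.Feed("https://mydyn365.URL/data"),
    Invoices = Source{[Name = "SalesInvoiceHeaders", Signature = "table"]}[Data],
    // Filtering right after the navigation step lets Power Query fold the condition
    // into the OData $filter clause, so D365 returns far fewer rows
    Recent = Table.SelectRows(Invoices, each [InvoiceDate] >= #datetime(2024, 1, 1, 0, 0, 0)),
    // Dropping unused columns folds to $select and shrinks the payload further
    Slim = Table.SelectColumns(Recent, {"InvoiceNumber", "InvoiceDate", "InvoiceAmount"})
in
    Slim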
The root cause is the timeout on the D365 side. Focusing on query optimization or incremental loading is usually the best way forward.
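And a rough sketch of the incremental refresh pattern mentioned above, using the same placeholder entity and date column. RangeStart and RangeEnd are the reserved datetime parameters you create in Power BI Desktop before configuring the incremental refresh policy on the table:

let
    // Same placeholder URL, entity and date column as in the sketch above
    Source = OData.Feed("https://mydyn365.URL/data"),
    Invoices = Source{[Name = "SalesInvoiceHeaders", Signature = "table"]}[Data],
    // RangeStart and RangeEnd must exist as datetime parameters in the model;
    // because the filter folds to OData, each refresh partition only pulls its own slice of rows
    Partitioned = Table.SelectRows(Invoices, each [InvoiceDate] >= RangeStart and [InvoiceDate] < RangeEnd)
in
    Partitioned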
Hi @amal_01
As we haven’t heard back from you, we wanted to kindly follow up and check whether the suggestions provided by the community members resolved the issue. Please feel free to contact us if you have any further questions.
Thanks and regards
Hi @amal_01 ,
May I check whether this issue has been resolved? If not, please feel free to contact us with any further questions.
Thank you
Hi @amal_01
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions.
Thank you.
Hi @amal_01 ,
Thanks for reaching out to the Microsoft fabric community forum.
Scheduled refreshes for imported semantic models will time out after two hours. For semantic models in Premium workspaces, this limit is extended to five hours. If you reach this time-out, try reducing the size or complexity of your semantic model, or split the large model into several smaller ones.
Troubleshoot refresh scenarios - Power BI | Microsoft Learn
I hope this information helps. Please do let us know if you have any further queries.
Thank you
The default timeout value for semantic model refresh is 18,000 seconds (five hours). AFAIK that is not something you can change in the admin portal. This would lead me to believe that the 7200 is coming from D365.
Adjusting the timeout in your query (10,800 seconds in your case, from #duration(0, 3, 0, 0)) requires the cooperation (or tolerance) of the data source to be effective. Enforced data source timeouts override your requests.