JoséLisboa
New Member

Error on scheduled refresh

This error occurs whenever the refresh runs longer than 2 hours.

error log:
{"error":{"code":"DM_GWPipeline_Gateway_MashupDataAccessError","pbi.error":{"code":"DM_GWPipeline_Gateway_MashupDataAccessError","parameters":{},"details":[{"code":"DM_ErrorDetailNameCode_UnderlyingErrorCode","detail":{"type":1,"value":"-2147467259"}},{"code":"DM_ErrorDetailNameCode_UnderlyingErrorMessage","detail":{"type":1,"value":"Tempo limite expirado. O período de tempo limite transcorreu antes da conclusão da operação."}},{"code":"DM_ErrorDetailNameCode_UnderlyingHResult","detail":{"type":1,"value":"-2147467259"}},{"code":"Microsoft.Data.Mashup.HostingError.Reason","detail":{"type":1,"value":"Timeout"}}],"exceptionCulprit":1}}}

(The Portuguese message translates to: "Timeout expired. The timeout period elapsed before the operation completed.")

2 ACCEPTED SOLUTIONS
collinq
Super User

Hi @JoséLisboa ,

 

The standard refresh timeout is 2 hours, so this is likely a service-level limit rather than a problem with your data. It can be raised by purchasing a Premium workspace or Premium capacity. If that is not an option, you can reduce the query's workload: break it into multiple queries, or use the Performance Analyzer to find inefficiencies in your query.
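As one illustration of reducing the query, filtering rows as early as possible in Power Query often lets the filter fold back to the source, so far less data crosses the gateway during refresh. A minimal M sketch, assuming a SQL Server source (server, database, table, and column names are hypothetical):

```
let
    // Hypothetical connection; replace with your actual source
    Source = Sql.Database("my-server", "my-db"),
    Sales  = Source{[Schema = "dbo", Item = "Sales"]}[Data],
    // Filtering early lets this step fold to the source as a WHERE clause,
    // so only recent rows are pulled during refresh
    Recent = Table.SelectRows(Sales, each [OrderDate] >= #date(2024, 1, 1))
in
    Recent
```

In the Power Query editor you can right-click the step and check View Native Query to confirm the filter actually folded to the source.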




Did I answer your question? Mark my post as a solution!

Proud to be a Datanaut!
Private message me for consulting or training needs.





v-linhuizh-msft
Community Support

Your solution is great, @collinq.

 

Hi @JoséLisboa, building on the reply above, here is a more detailed description for reference:


As described in Data refresh in Power BI - Power BI | Microsoft Learn, the maximum refresh duration is 2 hours on shared capacity and up to 5 hours on Premium capacity, and you can bypass the 5-hour limit by refreshing through the XMLA endpoint. The bottom of that article lists refresh best practices, for example:

- Schedule your refreshes for less busy times, especially if your semantic models are on Power BI Premium. Distributing the refresh cycles for your semantic models across a broader time window helps avoid peaks that might otherwise overtax available resources.
- Consider using DirectQuery/LiveConnect mode instead of Import mode if the increased load at the source and the impact on query performance are acceptable.

 

Given that you're refreshing on shared capacity, consider reducing the size or complexity of your semantic model, or refactoring one large semantic model into several smaller ones. Optimize your semantic models to include only the tables and columns that your reports and dashboards actually use. The article "How to manage the large datasets in Microsoft Power BI" by Nadakkannu Kuthalaraja on Medium emphasizes several key strategies for optimizing semantic models: efficient data modeling, data reduction, performance optimization, and so on.
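One concrete way to keep only the columns a report uses is Table.SelectColumns in Power Query M. A rough sketch (connection, table, and column names are hypothetical):

```
let
    Source  = Sql.Database("my-server", "my-db"),
    Orders  = Source{[Schema = "dbo", Item = "Orders"]}[Data],
    // Keep only the columns the reports actually use; dropping the rest
    // shrinks the model and shortens refresh time
    Trimmed = Table.SelectColumns(Orders, {"OrderID", "CustomerID", "OrderDate", "Amount"})
in
    Trimmed
```

Against a relational source this step also folds, so the unused columns are never transferred in the first place.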

 

Best Regards,
Zhu
Community Support Team

 

If any post helps, please consider accepting it as the solution to help other members find it more quickly.
If I misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!

