This error occurs whenever the refresh runs for longer than 2 hours.
Error log:
{"error":{"code":"DM_GWPipeline_Gateway_MashupDataAccessError","pbi.error":{"code":"DM_GWPipeline_Gateway_MashupDataAccessError","parameters":{},"details":[{"code":"DM_ErrorDetailNameCode_UnderlyingErrorCode","detail":{"type":1,"value":"-2147467259"}},{"code":"DM_ErrorDetailNameCode_UnderlyingErrorMessage","detail":{"type":1,"value":"Tempo limite expirado. O período de tempo limite transcorreu antes da conclusão da operação."}},{"code":"DM_ErrorDetailNameCode_UnderlyingHResult","detail":{"type":1,"value":"-2147467259"}},{"code":"Microsoft.Data.Mashup.HostingError.Reason","detail":{"type":1,"value":"Timeout"}}],"exceptionCulprit":1}}}
Hi @JoséLisboa,
The standard refresh timeout is 2 hours, so this is a service limit rather than a problem with your data. It can be raised by purchasing a Premium workspace or Premium capacity. If that is not an option, then you can reduce the query: perhaps break it into multiple queries, or use the Performance Analyzer to see if your query has some inefficiencies.
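For reference (not part of the original answer), the refresh history is exposed through the Power BI REST API, so you can confirm that the failures really are hitting the 2-hour wall. A minimal sketch in Python; the workspace ID, dataset ID, and Azure AD bearer token are placeholders you would need to supply:

from datetime import datetime
import requests

GROUP_ID = "<workspace-id>"      # placeholder
DATASET_ID = "<dataset-id>"      # placeholder
TOKEN = "<aad-bearer-token>"     # placeholder; acquire e.g. via MSAL

def parse_ts(ts):
    # The API returns ISO-8601 timestamps; keep second precision for simplicity.
    return datetime.fromisoformat(ts[:19]) if ts else None

url = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
       f"/datasets/{DATASET_ID}/refreshes?$top=10")
resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()

for r in resp.json()["value"]:
    # Each entry carries status plus start/end times of the refresh attempt.
    start, end = parse_ts(r.get("startTime")), parse_ts(r.get("endTime"))
    duration = (end - start) if (start and end) else None
    print(r["status"], "duration:", duration)

Failed refreshes whose durations cluster just above two hours point to the capacity timeout rather than a source-side problem.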
Proud to be a Datanaut!
Private message me for consulting or training needs.
Your solution is great, @collinq.
Hi @JoséLisboa, building on the reply already given, I would like to share a more specific description for reference:
As described in Data refresh in Power BI - Power BI | Microsoft Learn, the maximum refresh time is 2 hours on shared capacity and up to 5 hours on Premium capacity, and you can bypass the 5-hour limit by refreshing through the XMLA endpoint. The bottom of that article also lists refresh best practices, for example: schedule your refreshes for less busy times, especially if your semantic models are on Power BI Premium; distribute the refresh cycles for your semantic models across a broader time window to help avoid peaks that might otherwise overtax available resources; and consider using DirectQuery/LiveConnect mode instead of Import mode if the increased load at the source and the impact on query performance are acceptable.
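On Premium, the XMLA-level refresh mentioned above is also exposed programmatically through the enhanced refresh REST API, which lets you refresh individual tables or partitions so that no single operation approaches the timeout. A hedged sketch (not from the original reply), reusing the same placeholder IDs and token as the earlier example, with an illustrative table name:

import requests

GROUP_ID = "<workspace-id>"   # placeholder
DATASET_ID = "<dataset-id>"   # placeholder
TOKEN = "<aad-bearer-token>"  # placeholder

url = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
       f"/datasets/{DATASET_ID}/refreshes")
body = {
    "type": "full",                   # full reprocess of the selected objects
    "commitMode": "transactional",    # commit only if every object succeeds
    "objects": [{"table": "Sales"}],  # illustrative table name, not from the thread
    "retryCount": 1,
}
resp = requests.post(url, json=body,
                     headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()
# The service replies 202 Accepted; track the outcome via the refresh history.
print(resp.status_code)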
Given that you're refreshing on shared capacity, you can consider reducing the size or complexity of your semantic model, or refactoring the large semantic model into multiple smaller ones. Optimize your semantic models to include only the tables and columns that your reports and dashboards use. The article How to manage the large datasets in Microsoft Power BI | by Nadakkannu Kuthalaraja | Medium highlights several key strategies for optimizing semantic models: efficient data modeling, data reduction, performance optimization, and so on.
Best Regards,
Zhu
Community Support Team
If any post helps, please consider accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!