Hello team,
Hope you are all doing well. I'm facing a "Timed Out" error in the service. I have a Pro account, and when I initially refresh the dataset in the service, it gives a "Timed Out" error after 2 hours. What are the ways to refresh the data in the service?
Thank you in advance for your efforts.
Hoping someone can help with this. Here is a screenshot of the number of refresh attempts. As you can see, the whole process took around 2 hours.
Hi,
Apologies if I am bringing up an old topic. I am facing a similar issue. How did you manage to resolve yours?
Hi @Uniqueusername,
In our case the data was quite large, and since refreshes on a Pro account run in shared memory, after consulting with the Microsoft Support team we moved to a Premium account, which gave us dedicated memory to refresh the report.
Thanks for that. We are on a Premium subscription as well. Were you seeing the slow refresh times while scheduling auto-refresh through the gateway?
I have been noticing my reports taking a lot longer to refresh than some of the other reports in the same workspace on the gateway. My scheduled refreshes seem to take hours, while a forced, on-demand refresh takes only a few minutes.
My model uses a mixed storage mode, with DirectQuery, Dual, and Import tables.
I'm having issues this morning with timeouts when publishing from Power BI Desktop to the service. Anyone else seeing this?
Hi. Power BI Pro has a limit of 2 hours to refresh your data; if you exceed that time, the refresh is cancelled. Hitting the limit can happen for different reasons, such as bandwidth (if you have a gateway) or dataset size.
You should consider reading about downsizing your data model. There are great YouTube videos from Guy in a Cube and others on making your model smaller. Usually building a good star schema helps.
Another thing to check is whether your source is returning the data quickly. You might have a big native query against the source that takes a long time. Those scenarios may need an intermediate stored procedure that builds a table in the database engine, so that Power BI only reads the resulting table and avoids running a complex query.
If you are on-premises you can also check bandwidth, but that is probably the last resort, because the first two points I mentioned are the most common issues.
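If you want to see exactly how long each refresh is running before it hits that 2-hour limit, one option is to pull the refresh history through the Power BI REST API. Below is a minimal sketch, assuming you already have an Azure AD access token and know the workspace and dataset IDs (those values are placeholders, not real IDs):

```python
# Minimal sketch: list recent refreshes for a dataset and print how long each took.
# Assumes you already hold an Azure AD access token (e.g. acquired with MSAL) and
# know the workspace (group) and dataset IDs; the values below are placeholders.
from datetime import datetime

import requests

ACCESS_TOKEN = "<your-azure-ad-access-token>"   # placeholder
GROUP_ID = "<workspace-id>"                     # placeholder
DATASET_ID = "<dataset-id>"                     # placeholder

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/datasets/{DATASET_ID}/refreshes?$top=10"
)
resp = requests.get(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
resp.raise_for_status()

for refresh in resp.json()["value"]:
    start = datetime.fromisoformat(refresh["startTime"].replace("Z", "+00:00"))
    status = refresh["status"]
    if refresh.get("endTime"):
        end = datetime.fromisoformat(refresh["endTime"].replace("Z", "+00:00"))
        minutes = (end - start).total_seconds() / 60
        print(f"{start:%Y-%m-%d %H:%M}  {status:<10}  {minutes:.0f} min")
    else:
        print(f"{start:%Y-%m-%d %H:%M}  {status:<10}  still running")
```

That makes it easier to see whether the refreshes are failing right at the 2-hour mark (the Pro limit) or somewhere earlier (more likely a source or gateway problem).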
I hope that makes sense,
Happy to help!
Does this limitation also apply to Power BI Report Server on-premises (which is installed on our SQL Server)?
Hi @ibarrau, thank you so much for your reply.
Everything you mentioned makes sense, and I have taken care of every point, but the dataset still contains a large amount of data. Is there any other way I can refresh the dataset?
If you are sure you have a solid star schema connected to a source that returns single tables quickly (not a native query with logic in it) and you still hit the 2-hour limit, then you should consider a different license such as PPU (Premium Per User) or moving the dataset to Analysis Services. There is also a Premium capacity license. If you are sure you have optimized everything, the limit can't be skipped; that tells you a Pro license is not enough for your model.
I know incremental refresh is now available for Pro, but I'm not sure whether it will let you skip this limit. You can try it if you have a database source that supports query folding (SQL Server, Synapse, Postgres, etc.).
I hope that helps,
Happy to help!
Thanks a lot for your response. So, does this mean the only way to get around the 2-hour limit is to move to a higher license? If so, would moving to Power BI Embedded help?
I'm also facing one more odd issue: when I publish the dataset (incremental refresh is configured) to the service, the initial refresh should usually be triggered on its own, but that doesn't seem to be happening at the moment...
If you want to extend the 2-hour limit, yes, you need a new license. You can do it with Embedded; I think Azure Analysis Services is a cheaper option (though it involves a migration), or if not many people need the content, shared PPU might be cheaper too.
Regarding the second question, I'm sorry but I'm not sure of the answer. However, I think Patrick from the Guy in a Cube YouTube channel mentioned that in one of his videos. You might want to check his incremental refresh videos before jumping to an Embedded capacity.
Regards
Happy to help!
Thanks a lot for your response. However, when I publish the dataset, the auto-refresh triggers, runs for 2 hours, and then gives a time-out error. When I refresh it again on demand, the dataset refreshes successfully. So every time, the first refresh gives a time-out error and the second refresh executes successfully. Doesn't that seem like strange behaviour?
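In case it helps anyone hitting the same behaviour: since the on-demand refresh succeeds where the scheduled one times out, you can also kick off that on-demand refresh programmatically through the Power BI REST API rather than clicking "Refresh now" in the service. A minimal sketch, assuming you supply your own Azure AD access token and workspace/dataset IDs (placeholders below):

```python
# Minimal sketch: trigger an on-demand dataset refresh through the Power BI REST API,
# the programmatic equivalent of pressing "Refresh now" in the service.
# ACCESS_TOKEN, GROUP_ID and DATASET_ID are placeholders you would supply yourself.
import requests

ACCESS_TOKEN = "<your-azure-ad-access-token>"   # placeholder
GROUP_ID = "<workspace-id>"                     # placeholder
DATASET_ID = "<dataset-id>"                     # placeholder

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/datasets/{DATASET_ID}/refreshes"
)
# notifyOption controls whether the dataset owner is emailed when the refresh fails.
resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"notifyOption": "MailOnFailure"},
)
resp.raise_for_status()
print("Refresh request accepted, HTTP", resp.status_code)  # 202 means it was queued
```

The POST returns 202 Accepted once the refresh is queued; you can then poll the refresh history endpoint (as in the earlier sketch) to watch its status.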
Did you ever fix this issue?