March 31 - April 2, 2025, in Las Vegas, Nevada. Use code MSCUST for a $150 discount! Early bird discount ends December 31.
Are there any risks/concerns with large data refreshes of datasets on the capacity? Can a refresh cause a slowdown/crash of the capacity itself?
I understand that there are size limits (e.g., 1 GB per dataset on Pro, or 100 TB total against a capacity) and a maximum number of refreshes per day, but I'd like to understand the impact a refresh can have on the capacity.
For example, say I have a dataset that is currently 600 MB in size and is scheduled to refresh daily at 12pm. 1 GB of additional data is added to the dataset at 10am (so the total size is now 1.6 GB). During the 12pm scheduled refresh, are there any risks involved aside from the time it takes to refresh on the capacity itself? E.g., can it slow down the service or crash it?
Thanks
Hey @Morkil ,
the scenario you describe can cause a slowdown on your tenant, but the service organizes the compute resources for a scheduled refresh by itself.
So normally you shouldn't feel any impact (slowdown) on your tenant.
I don't notice such impacts during refresh processes myself.
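If you want to verify this rather than take it on faith, one option is to pull the dataset's refresh history from the Power BI REST API (`GET /v1.0/myorg/datasets/{datasetId}/refreshes`) and watch whether refresh durations grow as the data does. A minimal sketch; the helper just computes a duration from the ISO timestamps the API returns, and the sample entry below is made up for illustration:

```python
from datetime import datetime

def refresh_minutes(start_iso: str, end_iso: str) -> float:
    """Duration of one refresh in minutes, from the ISO-8601 timestamps
    in the refresh-history response (fractional seconds are ignored)."""
    fmt = "%Y-%m-%dT%H:%M:%S"
    start = datetime.strptime(start_iso[:19], fmt)
    end = datetime.strptime(end_iso[:19], fmt)
    return (end - start).total_seconds() / 60

# One entry shaped like the API's refresh-history response
# (values are hypothetical):
entry = {"status": "Completed",
         "startTime": "2025-01-10T12:00:05.000Z",
         "endTime": "2025-01-10T12:14:35.000Z"}

print(f"{entry['status']}: "
      f"{refresh_minutes(entry['startTime'], entry['endTime']):.1f} min")
```

In practice you would fetch the real history with an AAD bearer token and plot the durations over time; a refresh that keeps getting slower as the dataset grows is the early warning sign to look for.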
Regards
P.S. A 1.6 GB dataset is not a big one 🙂