Hi,
I'm creating some workspaces that contain data sources located on SharePoint, and I'd like to set up scheduled refreshes for these datasets.
What happens if a dataset grows extremely large between refreshes? Will this have any effect on the server (e.g., cause a crash)?
Thanks
SharePoint Online throttles data pulls so it doesn't break under load; I'm not sure whether SharePoint Server does the same (I'd assume so, though perhaps less aggressively). With extremely large SharePoint datasets, you're more likely to hit a timeout error than anything else.
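When SharePoint throttles a request it typically responds with HTTP 429, so a refresh pipeline that pulls data itself can retry with exponential backoff instead of failing outright. Here's a minimal sketch of that pattern; the `fetch` callable, its `(status, payload)` return shape, and the retry limits are all my own assumptions for illustration, not anything specific to the Power BI or SharePoint APIs:

```python
import time

def fetch_with_backoff(fetch, max_retries=5, base_delay=1.0):
    """Call a hypothetical fetch() that returns (http_status, payload).

    On a 429 (throttled) response, wait and retry, doubling the delay
    each attempt. Any other status is returned to the caller as-is.
    """
    for attempt in range(max_retries):
        status, payload = fetch()
        if status == 429:
            # Back off exponentially: base_delay, 2x, 4x, ...
            time.sleep(base_delay * (2 ** attempt))
            continue
        return status, payload
    raise RuntimeError("still throttled after %d retries" % max_retries)
```

In practice you'd also honor the `Retry-After` header if the service sends one, rather than relying purely on a fixed backoff schedule.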
Would there be any effect (crashes, etc.) on the capacity the dataset is hosted on (e.g., whether the workspace is on Pro or Premium capacity)?
Hi @Morkil
It will have an effect on the refresh time: as the dataset size increases, so does the refresh time. Refreshing datasets can also, in some cases, affect report performance if the Power BI capacity is close to its maximum.
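One way to watch for this is to track how refresh duration grows over time using the Power BI REST API's refresh history for the dataset. Below is a small sketch that computes the duration of each completed refresh from a history payload; the `status`, `startTime`, and `endTime` field names follow the Get Refresh History response, but treat the exact shape as an assumption and check it against the payload your tenant actually returns:

```python
from datetime import datetime

def refresh_durations(history):
    """Given the list of refresh entries from a dataset's refresh
    history, return (status, duration_seconds) for each entry that
    has finished (i.e., has an endTime)."""
    results = []
    for entry in history:
        if not entry.get("endTime"):
            # Still in progress or missing data: skip it.
            continue
        start = datetime.fromisoformat(entry["startTime"].replace("Z", "+00:00"))
        end = datetime.fromisoformat(entry["endTime"].replace("Z", "+00:00"))
        results.append((entry["status"], (end - start).total_seconds()))
    return results
```

If the durations trend upward toward your refresh timeout or your capacity's limits, that's the signal to consider incremental refresh or a larger capacity before anything actually fails.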
Is there any effect on the capacity it's hosted on (Premium capacity or Pro)? I'm wondering whether a refresh of a huge amount of added data can cause some type of crash.