Scenario:
P1 SKU Premium capacity
Max memory limit: 25 GB
Parallel refresh limit: 6 at a time
Total datasets refreshed daily in Prod at 9:00 AM EST: 10
All datasets are about 1 GB in size except for one big dataset of 23 GB. The large dataset storage format setting has been enabled for the 23 GB dataset, and it is also scheduled at 9:00 AM EST.
Questions:
1. How does the big dataset execute on this capacity when the total maximum memory allocated is 25 GB?
2. If all datasets are scheduled at the same time, which take preference? The smaller ones? If so, in what order, given they all share the same refresh schedule (9:00 AM EST daily)?
Hi @NewUser777 ,
The 23 GB dataset will not be able to refresh. Refreshing causes the memory used by the dataset to double.
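As a rough sanity check on the numbers (assuming a standard full refresh, where the old copy of the model stays in memory while the new copy is being processed): the 23 GB model needs roughly 23 GB x 2 ≈ 46 GB at refresh time, well over the 25 GB per-dataset memory limit of a P1, so the refresh fails. The 1 GB datasets need only around 2 GB each during refresh, so they fit comfortably within the limit.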
@djurecicK2 I have a model that is over 20 GB, and it runs on this capacity, taking almost 3-4 hours to complete. The setting I see on the model is large dataset storage format enabled. Hence I am wondering how this would run on this capacity, which imposes a 25 GB limit.
@djurecicK2 I finally got to the truth: the dataset that was over 20 GB was being refreshed manually via SSMS. From the Power BI service, as you stated, it didn't work.
I will close this thread. Thanks for your help.
https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-connect-tools#dataset-refresh
The XMLA endpoint enables a wide range of scenarios for fine-grain refresh capabilities using SSMS, automation with PowerShell, Azure Automation, and Azure Functions using TOM. For example, you can refresh certain incremental refresh historical partitions without having to reload all historical data.
Unlike configuring refresh in the Power BI service, refresh operations through the XMLA endpoint are not limited to 48 refreshes per day, and the scheduled refresh timeout is not imposed.
Date, time, and status for dataset refresh operations that include a write transaction through the XMLA endpoint are recorded and shown in dataset Refresh history.
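In case it is useful to anyone else hitting this, here is a minimal sketch of what that XMLA refresh can look like from PowerShell, using the SqlServer module's Invoke-ASCmd cmdlet. The workspace and dataset names are placeholders, and it assumes XMLA read/write is enabled on the Premium capacity; the same TMSL JSON can also be pasted into an XMLA query window in SSMS and executed there.

```powershell
# Requires the SqlServer PowerShell module: Install-Module SqlServer
Import-Module SqlServer

# Placeholder names -- replace with your own Premium workspace and dataset.
$workspace = "powerbi://api.powerbi.com/v1.0/myorg/Finance Workspace"
$dataset   = "BigSalesModel"

# TMSL full-refresh command for the whole model. To refresh only a specific
# table or incremental-refresh partition, add "table" / "partition"
# properties to the object below.
$tmsl = @"
{
  "refresh": {
    "type": "full",
    "objects": [
      { "database": "$dataset" }
    ]
  }
}
"@

# Sends the TMSL command to the workspace's XMLA endpoint. You are prompted
# to sign in with an account that has write access to the workspace.
Invoke-ASCmd -Server $workspace -Query $tmsl
```

Refreshes submitted this way show up in the dataset's Refresh history, as the doc excerpt above notes.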