According to this doc, the Max memory (GB) column represents an upper bound for the semantic model size, and an A1 capacity (the F8 SKU is the same as the A1 SKU) equals 3 GB. The same doc also mentions that refreshing will take roughly twice as much memory as the dataset size. Does that mean the effective dataset size limit on an A1 capacity is actually 1.5 GB? Thanks!
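To spell out the arithmetic I am assuming (a rough sketch; the 3 GB figure and the 2x refresh factor both come from the doc, not from my own measurements):

# Back-of-envelope arithmetic behind the question (assumption: a full
# import-mode refresh keeps the old and the new copy of the model in
# memory at the same time, i.e. roughly 2x the model size).
CAPACITY_MEMORY_GB = 3.0   # "Max memory (GB)" for A1 / F8 from the doc
REFRESH_FACTOR = 2.0       # the doc says refresh takes ~2x the dataset size

effective_limit_gb = CAPACITY_MEMORY_GB / REFRESH_FACTOR
print(f"Effective model size limit during a full refresh: {effective_limit_gb} GB")
# -> 1.5 GB, which is where the 1.5 GB in my question comes from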
Thanks for your reply! We are running tests. We may turn off the automatic refresh since the data will only be updated quarterly. The Power BI doc says that refreshing data doubles the memory usage, but I am curious whether that is also true when the data is refreshed by publishing from Power BI Desktop.
Hi @greatzto ,
The refresh you mention applies to import mode. For DirectQuery or live connection mode you can definitely exceed 1.5 GB. Import mode essentially refreshes the semantic model itself, so it needs a copy, which means it needs at least double the memory. The latter two modes refresh the underlying data source instead: the data itself is not stored in Power BI, and it is transferred through the gateway at query time.
Hope it helps!
Best regards,
Community Support Team_ Scott Chang
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.
Thanks for your reply. I just completed a meeting with a Power BI support engineer. He confirmed that the size limit of one dataset on an A1 embedded capacity is 3 GB. Although refreshing may take twice the memory, the 3 GB memory limit is not applied in that case. He also said he has seen datasets close to 3 GB in size refreshed without issues on an A1 capacity. Thus, I think the dataset size limit on an A1 capacity is 3 GB. We will also validate it in our dev environment.
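For reference, this is roughly how we plan to validate it in the dev workspace (a sketch only; the workspace and dataset IDs and the access token are placeholders you would supply yourself, and it simply triggers a full refresh through the Power BI REST API and polls the refresh history until it finishes):

import time
import requests

# Sketch: trigger a refresh of a near-3 GB model on the A1 capacity and
# poll the refresh history to see whether it completes or fails.
# Placeholders: WORKSPACE_ID, DATASET_ID, ACCESS_TOKEN.
WORKSPACE_ID = "<dev-workspace-guid>"
DATASET_ID = "<dataset-guid>"
ACCESS_TOKEN = "<aad-access-token>"

refreshes_url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/datasets/{DATASET_ID}/refreshes"
)
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# Kick off a refresh (the same kind of refresh a schedule would trigger).
requests.post(refreshes_url, headers=headers).raise_for_status()

# The most recent history entry reports "Unknown" while the refresh is
# still in progress, then "Completed" or "Failed".
while True:
    latest = requests.get(
        refreshes_url, headers=headers, params={"$top": 1}
    ).json()["value"][0]
    if latest["status"] != "Unknown":
        break
    time.sleep(60)

print("Refresh finished with status:", latest["status"])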
Hey @greatzto ,
most likely your assumption is not correct, but you have to test it!
If you use automatic refresh, then it's likely that the practical limit will be 1.6 GB or more, but definitely not 3 GB.
Once again: you have to test it!
Regards,
Tom
Hey @greatzto ,
if you are not using incremental refresh, then you have to reduce the max size of the semantic model by 50%.
The smaller the incremental batch, the closer you get to the max size.
Nevertheless, you also have to consider the RAM that is requested because users are interacting with the model, e.g. evaluating measures. This means the more measures and/or the more users, the more RAM you have to "reserve."
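As a rough budgeting sketch (all numbers except the 3 GB cap are made-up examples, so treat it as an illustration of the idea, not official guidance):

# Rough memory budget for an A1 / 3 GB capacity.
CAPACITY_MEMORY_GB = 3.0

model_size_gb = 1.8            # size of the semantic model at rest (example)
refresh_fraction = 0.25        # incremental refresh reloads only part of the model;
                               # use 1.0 for a full (non-incremental) refresh
user_query_headroom_gb = 0.4   # RAM "reserved" for users interacting with the model

peak_gb = model_size_gb * (1 + refresh_fraction) + user_query_headroom_gb
verdict = "fits" if peak_gb <= CAPACITY_MEMORY_GB else "does not fit"
print(f"Estimated peak memory: {peak_gb:.2f} GB ({verdict} in {CAPACITY_MEMORY_GB} GB)")
# With refresh_fraction = 1.0 the same 1.8 GB model would need ~4.0 GB,
# which is why you halve the max model size without incremental refresh.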
Hopefully, this adds some additional insights.
Regards,
Tom
Tom:
Thank you for the quick response. So my assumption is correct, and the actual dataset size limit is 1.5 GB for an A1 capacity when incremental refresh is not used?