Hello Community,
We have a PPU license and a model of ~40 GB. We also have ~370 users, so we are considering a shift to Fabric F64 (ex P1).
But we have some questions regarding model size.
In PPU we have 100 GB for a model, and in F64 only 25 GB, so if we understand the license options correctly, F64 is not an option.
On the other hand, in F64 we can build a Data Warehouse, a dataset is automatically created on top of that DWH, and we can also produce many more datasets from that DWH.
But how big can the DWH be in F64? Is it also limited to 25 GB? If not, and the DWH can be bigger than 25 GB, what happens to the size of the automatically created dataset? And what happens to the other datasets created from that DWH, are they still limited to 25 GB?
Does anyone have experience with this?
Thank you in advance!
Yup, PPU is equivalent to F256. So moving to P1 will not work.
I think you are referring to a Lakehouse (not a DWH, but yeah, semantics :D). The auto-created dataset is gonna be in Direct Lake mode, so you will need to review the limitations of that mode.
https://learn.microsoft.com/en-us/fabric/get-started/direct-lake-overview#fallback
AFAIK (since things are changing all the time), there's no limit on the storage/size of the Lakehouse or of the semantic model in Direct Lake mode, but there is a limit on Direct Lake compute/queries, and if you go over the limit, it will fall back to DirectQuery (which is supposedly slower than Direct Lake).
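To make that fallback rule concrete, here is a conceptual sketch, not the engine's actual logic. The guardrail numbers are assumptions taken from the current Direct Lake overview docs (F64: roughly 1.5 billion rows per table and 25 GB model memory) and may change, so verify them against the linked page:

```python
# Conceptual sketch of Direct Lake "Automatic" fallback; NOT the real engine code.
# Guardrail values are assumptions from the current docs and may change.
from dataclasses import dataclass

@dataclass
class Guardrails:
    max_rows_per_table: int    # assumed ~1.5 billion on F64
    max_model_memory_gb: int   # assumed 25 GB on F64

def query_mode(rows_in_largest_table: int, model_memory_gb: float,
               g: Guardrails) -> str:
    """Which mode a query would run in: Direct Lake, or DirectQuery fallback."""
    if (rows_in_largest_table > g.max_rows_per_table
            or model_memory_gb > g.max_model_memory_gb):
        return "DirectQuery"   # a guardrail is exceeded -> fall back
    return "DirectLake"

f64 = Guardrails(max_rows_per_table=1_500_000_000, max_model_memory_gb=25)
print(query_mode(2_000_000_000, 40, f64))  # over both guardrails -> DirectQuery
print(query_mode(500_000_000, 10, f64))    # within guardrails -> DirectLake
```

The point of the sketch: fallback is per-query and automatic, so a model that exceeds a guardrail still works, it just stops getting the in-memory Direct Lake path.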
Please do note that there could be a cost for the storage of the Lakehouse:
https://azure.microsoft.com/en-ca/pricing/details/microsoft-fabric/