Hello Community,
We have been working with dataflows in the Power BI service for quite some time, and for some reason they are not running consistently. I understand that the data we transform in dataflows is stored in Azure Data Lake Storage (ADLS) managed by Microsoft (we do not have our own ADLS instance).
I have also read that Premium capacity users get a maximum of 100 TB of storage in this managed data lake. Since we do not have our own data lake instance, we cannot tell how much of that space is being used, and given the inconsistent behaviour of our dataflows, we are wondering whether the available storage has been fully consumed.
Could someone advise how to check the storage space utilised?
Thanks,
G Venkatesh
Hi @Anonymous,
Based on my investigation, I did not find a way to check the space usage directly. If you are a Pro user, you could try opening a support ticket to see whether the support team can help.
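As a starting point, you can at least enumerate the dataflows in a workspace with the documented Power BI REST API, although the response does not include storage consumption as far as I can tell. Below is a minimal Python sketch, assuming you already have an Azure AD access token with permission to read the workspace; the ACCESS_TOKEN and WORKSPACE_ID values are placeholders you would need to supply.

```python
import requests

# Placeholders: supply your own Azure AD access token (e.g. acquired via MSAL)
# and the GUID of the workspace you want to inspect.
ACCESS_TOKEN = "<your-access-token>"
WORKSPACE_ID = "<your-workspace-id>"

# Documented Power BI REST endpoint that lists the dataflows in a workspace.
url = f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/dataflows"

response = requests.get(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
response.raise_for_status()

# Note: the response carries metadata (name, objectId, etc.) but not the storage
# consumed in the Microsoft-managed data lake, which does not appear to be
# exposed through any documented API.
for dataflow in response.json().get("value", []):
    print(dataflow.get("name"), dataflow.get("objectId"))
```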
Best Regards
Rena