Hello Community,
We have been working with dataflows in the Power BI service for quite some time, and for some reason they are not running consistently. I understand that the data we transform in dataflows is stored in Azure Data Lake Storage (ADLS) managed by Microsoft (we do not have our own ADLS instance).
I also read somewhere that Premium capacity users get up to 100 TB of storage in ADLS for dataflow data. Since we do not have our own data lake instance, we have no visibility into how much of that space is being used. Because the dataflows are behaving inconsistently, we are wondering whether the storage quota might be fully used.
Could someone assist with how to check the storage space utilised?
Thanks,
G Venkatesh
Hi @Anonymous,
Based on my investigation, I did not find a way to check the space usage. You could try opening a support ticket, if you are a Pro user, and see whether they can help.
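While I am not aware of a documented API that reports dataflow storage consumption, you can at least enumerate the dataflows in each workspace with the Power BI REST API ("Dataflows - Get Dataflows" endpoint) to see what is stored. This is only a sketch: it assumes you already have an access token with the `Dataflow.Read.All` scope, and it does not report bytes used.

```python
import json
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"

def dataflows_url(group_id: str) -> str:
    # URL for the "Dataflows - Get Dataflows" endpoint of one workspace (group).
    return f"{API}/groups/{group_id}/dataflows"

def list_dataflows(token: str, group_id: str) -> list:
    # Returns dataflow metadata (objectId, name, modelUrl) for the workspace.
    # `token` is an Azure AD access token for the Power BI API (assumption:
    # you obtain it separately, e.g. via MSAL interactive or service principal).
    req = urllib.request.Request(
        dataflows_url(group_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp).get("value", [])
```

You would call `list_dataflows(token, workspace_id)` for each workspace you administer; the returned metadata at least shows what dataflows exist, even though sizes are not exposed.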
Best Regards
Rena