Hello Community,
We have been working with dataflows in the Power BI Service for quite some time, and for some reason they are not running consistently. My understanding is that the data we transform in dataflows is stored in Azure Data Lake Storage (ADLS Gen2) managed by Microsoft (we do not have our own ADLS instance).
I also read somewhere that Premium capacity users get a maximum of 100 TB of storage in ADLS. Since we do not have our own data lake instance, we have no visibility into how much space is being used. Because the dataflows are behaving inconsistently, questions arise such as whether the storage quota has been fully consumed.
Could someone advise how to check the storage space utilised?
Thanks,
G Venkatesh
Hi @Anonymous,
Based on my investigation, I did not find a way to check the space usage. You could try opening a support ticket and see if they can help, if you are a Pro user.
Best Regards
Rena