Tanayraj
Frequent Visitor

My Capacity storage number is absurd: "Your organization's compute capacity has exceeded its limit"

Hello,

 

I have created a warehouse using Dataflow Gen2. I have a trial Fabric capacity (F64). A few days ago, I tried to open my warehouse and got the message: "Unable to complete the requested action because your organization's compute capacity has exceeded its limit."

 

The warehouse has 10 tables, populated by 10 dataflows, with 4 to 5 lightweight Power Query steps before loading into the warehouse. The total number of rows across all tables is under 6 million. I have been using this setup for more than 5 months, but last week I got the error message above. While going through the Microsoft documentation, I installed the Fabric Capacity Metrics app to monitor my capacity. When I looked at storage, I saw absurd numbers: it shows the size of my workspace, which contains the warehouse, as 640 GB (see Screenshot-1 below). Is that even possible? The source database I load from is nowhere near that big. I then used Microsoft OneLake file explorer for Windows to check the size, and it showed my workspace at under 15 GB, which is still high but plausible (see Screenshot-2).

If Microsoft OneLake file explorer for Windows is not the right way to check the size of the workspace, then why and how does Fabric Capacity Metrics show 640 GB?
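For a programmatic cross-check, OneLake exposes the same ADLS Gen2 APIs as Azure Data Lake Storage, so you can sum file sizes yourself. A minimal sketch, assuming the azure-identity and azure-storage-file-datalake Python packages and that your signed-in account can read the workspace ("MyWorkspace" is a placeholder name):

# Sum the sizes of all files in a Fabric workspace via the OneLake
# ADLS Gen2 endpoint, as a cross-check against OneLake file explorer.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

ONELAKE_URL = "https://onelake.dfs.fabric.microsoft.com"
WORKSPACE = "MyWorkspace"  # placeholder: replace with your workspace name

service = DataLakeServiceClient(ONELAKE_URL, credential=DefaultAzureCredential())
filesystem = service.get_file_system_client(WORKSPACE)

# Walk every path in the workspace and add up the file sizes.
total_bytes = 0
for path in filesystem.get_paths(recursive=True):
    if not path.is_directory:
        total_bytes += path.content_length or 0

print(f"Workspace files total: {total_bytes / 1024**3:.2f} GiB")

Note that this counts the files stored in OneLake, which is not necessarily the same figure the Capacity Metrics app reports as storage.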

(Screenshots attached: Screenshot-1.png, Screenshot-2.png)

Can anyone explain why 6 million rows in the warehouse would use 640 GB?

2 REPLIES
hackcrr
Super User

Hi, @Tanayraj 

First, regarding the 640 GB storage size shown in the Fabric Capacity Metrics app: this sounds unrealistic, especially since your source data totals less than 6 million rows and you are only using a warehouse and dataflows. There could be several reasons for this:
1. There may be an error in the Fabric Capacity Metrics app, or the metrics displayed may not correspond exactly to the capacity you are actually using.
2. If you have multiple workspaces in your tenant sharing the same Fabric capacity, activity in the other workspaces may also be affecting your capacity usage.
3. The Power BI backend may perform tasks such as snapshots, caching, and indexing, which can temporarily increase capacity usage.
4. If the dataflows are refreshed very frequently or the query load is very high, this can also cause a spike in capacity usage.
5. Power BI may store logs and temporary files, which can also consume capacity.
As for the 15 GB shown by Microsoft OneLake file explorer for Windows, that sounds closer to the actual size of your data warehouse. However, note that this tool may only show specific types of files or data and may not fully represent the capacity usage of the entire workspace.
To resolve this issue, you can try the following steps:
Re-check the capacity allocation to ensure that your F64 capacity is not being over-utilized by other workspaces or tenants (see the sketch after these steps for one way to list them).
Use the Fabric Capacity Metrics app to continuously monitor capacity usage and watch for unusual patterns or spikes.
If your problem persists and you think your capacity usage is unreasonable, you can open a ticket via the link below, and a dedicated Microsoft engineer will check your capacity issue for you:

https://admin.powerplatform.microsoft.com/newsupportticket/powerbi
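For the first step, here is a rough sketch of how you could list which workspaces are assigned to your capacity, using the Power BI admin REST API. This is an illustration only: it requires an admin identity, and the capacity ID below is a placeholder.

import requests
from azure.identity import DefaultAzureCredential

CAPACITY_ID = "00000000-0000-0000-0000-000000000000"  # placeholder: your capacity ID

# Token for the Power BI REST API (admin permissions needed for /admin endpoints).
token = DefaultAzureCredential().get_token(
    "https://analysis.windows.net/powerbi/api/.default"
).token

resp = requests.get(
    "https://api.powerbi.com/v1.0/myorg/admin/groups?$top=5000",
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()

# capacityId is only present on workspaces that are assigned to a capacity,
# so filter client-side and print every workspace sharing yours.
for ws in resp.json()["value"]:
    if ws.get("capacityId", "").lower() == CAPACITY_ID.lower():
        print(ws["name"])

If several busy workspaces show up, their combined load on the shared F64 could explain the throttling even if your own workspace is small.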

 

 

Best Regards,

hackcrr

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

Hello and thank you @hackcrr ,

 

I completely agree with you: 6 million rows taking 640 GB is unrealistic.

I have triple-checked the capacity ID and Fabric Capacity Metrics. The numbers displayed in the metrics correspond to the capacity I am using.

 

I was going through Fabric Capacity Metrics, and it shows one process occupying 3,889,422.248 CUs. This number also seems unrealistic: 300% utilization of CU. The thing is, no one in my organization started that process, but it shows as running under user xyz. Please check Screenshots 1, 2, and 3. Once that process started, we were not able to access any workspace, data warehouse, dataflow, or anything else. Everything was frozen for 2 days, and we had to wait before we could do anything. We stopped every refresh as well.
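For reference, here is my back-of-the-envelope calculation, assuming that figure is in CU-seconds (the unit the Capacity Metrics app reports):

# An F64 provides 64 capacity units (CUs), so its full 24-hour budget
# in CU-seconds is 64 * 86,400.
F64_CUS = 64
day_budget_cu_s = F64_CUS * 24 * 60 * 60   # 5,529,600 CU-seconds
operation_cu_s = 3_889_422.248             # the figure from the Metrics app

print(f"24h F64 budget: {day_budget_cu_s:,} CU-s")
print(f"One operation:  {operation_cu_s:,.0f} CU-s")
print(f"Share of a day: {operation_cu_s / day_budget_cu_s:.0%}")  # ~70%

So that single process alone would be roughly 70% of the F64's entire 24-hour budget, and since Fabric smooths background operations over 24 hours, that might explain why everything stayed frozen for so long.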

I do not know what the size was before that process started, but do you think the storage size went up to 640 GB because of that process?

That process has finished and I can access everything again, but I haven't seen any change in my workspace or data warehouse.

 

 

 

 

(Screenshots attached: Screenshot_1.png, Screenshot_2.png, Screenshot_3.png)

 
