PrachiJain_2025
Advocate I

Notebook : Increased CU Consumption for Notebook Pipeline

Hello Team,

We use a notebook to fetch API data every 10 minutes, scheduled through a pipeline. It worked fine and consumed relatively few CUs until 2025-04-14.
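For context, the fetch step can be instrumented so each run logs its own duration, which makes runtime regressions visible directly in the notebook logs. A minimal sketch; the real API call is stood in by a placeholder, since the actual endpoint is not shown in this post:

```python
import time

def timed_call(fn, *args, **kwargs):
    """Run fn and return (result, elapsed_seconds).

    Logging the elapsed time of the API fetch on every 10-minute run
    makes it easy to spot when runtime (and hence CU usage) started growing.
    """
    start = time.monotonic()
    result = fn(*args, **kwargs)
    elapsed = time.monotonic() - start
    return result, elapsed

# Stand-in for the real API call (placeholder payload):
payload, elapsed = timed_call(lambda: {"rows": 100})
print(f"fetch took {elapsed:.3f}s")
```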

The screenshot below shows the Time Point Detail for 2025-04-14. CU consumption was:

[Screenshot: PrachiJain_2025_4-1745582015534.png]

Now, the screenshot below shows the Time Point Detail for 2025-04-23:

[Screenshot: PrachiJain_2025_3-1745581959237.png]

We’ve noticed that CU consumption has increased, even though the job is unchanged and we haven’t modified anything. The data volume from the source is also consistent, so the amount of data being processed has not changed significantly.

 

Could you help us understand why CU consumption is increasing?
How can we trace the increased CU consumption and control it? We are currently using notebook logs. Are there any additional steps or tools we can use to monitor and manage CU consumption more effectively?

 

 

Thank you

 

1 ACCEPTED SOLUTION
v-karpurapud
Community Support

Hi @PrachiJain_2025 
Thank you for reaching out to the Microsoft Fabric Community Forum.

 

We understand you are seeing an increase in CU consumption during the execution of a notebook pipeline. Notebooks consume CUs based on the compute resources (e.g., CPU, memory) used during execution. If the notebook takes longer to run or uses more resources, CU consumption increases. This may be due to increased API response times or heavier data processing extending the runtime.

 

Microsoft Fabric periodically updates its runtime environments (e.g., Spark runtime, Python libraries). An update could have introduced changes affecting resource usage, such as a less optimized Spark version or updated dependencies.

 

To determine the reason for the increased CU consumption, please follow these steps using Microsoft Fabric’s monitoring tools and logs:

 

Use the Microsoft Fabric Capacity Metrics app to monitor CU usage across all workloads in the capacity.

Access the Microsoft Fabric portal, open the Capacity Metrics app, filter by your Notebook Pipeline, set the date range to before and after 2025-04-14, and compare CU consumption trends. Check for any spikes or other heavy workloads running at the same time that could impact CU usage.

Pricing for data pipelines - Microsoft Fabric | Microsoft Learn
What is the Microsoft Fabric Capacity Metrics app? - Microsoft Fabric | Microsoft Learn

 

Independently test the API using Postman or a Python script, measure current response times, compare them with historical performance, and verify with the API provider if any updates, rate limits, or throttling changes were introduced around 2025-04-14.
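That API test can be sketched in a short Python script. The endpoint URL below is hypothetical; the stdlib `urllib` is used to stay dependency-free, but Postman or `requests` work equally well:

```python
import statistics
import time
import urllib.request

def measure_latency(url: str, runs: int = 5, timeout: float = 60.0) -> dict:
    """Time `runs` GET requests against `url` and summarize the latency,
    so current API response times can be compared with historical ones."""
    samples = []
    for _ in range(runs):
        start = time.monotonic()
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            resp.read()  # include transfer time, not just time-to-first-byte
        samples.append(time.monotonic() - start)
    return {
        "min_s": min(samples),
        "median_s": statistics.median(samples),
        "max_s": max(samples),
    }

# Hypothetical endpoint -- replace with the real API before running:
# print(measure_latency("https://api.example.com/data"))
```

If the median today is noticeably higher than it was before 2025-04-14, the extra CU usage is likely just the notebook waiting longer on the API.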

 

In the Fabric workspace, check the Spark runtime version and configurations (like spark.executor.memory and cluster settings) used by the notebook, compare them with the versions and settings before 2025-04-14, and review Fabric release notes for any runtime updates or changes.

Apache Spark runtime lifecycle in Fabric - Microsoft Fabric | Microsoft Learn
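To make that before/after comparison easier, the notebook can log its own runtime settings on every run. A minimal sketch, assuming the `spark` session that Fabric provides in notebooks; the key list is illustrative, not exhaustive:

```python
def log_runtime_settings(spark, keys=None):
    """Print the Spark version and a few CU-relevant settings for this run,
    so logs from runs before and after a suspected runtime change can be
    diffed. Pass the `spark` session available in the notebook."""
    if keys is None:
        keys = [
            "spark.executor.memory",
            "spark.executor.cores",
            "spark.dynamicAllocation.enabled",
        ]
    lines = [f"Spark version: {spark.version}"]
    for key in keys:
        # Unset keys are reported explicitly rather than raising an error
        lines.append(f"{key} = {spark.conf.get(key, 'not set')}")
    for line in lines:
        print(line)
    return lines

# In a Fabric notebook cell: log_runtime_settings(spark)
```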

If this response resolves your query, please mark it as the Accepted Solution to assist other community members. A Kudos is also appreciated if you found the response helpful.

 

Thank you!


4 REPLIES
v-karpurapud
Community Support

Hi @PrachiJain_2025 

We have not received a response from you regarding this query and are following up to check whether the information provided resolved it. If you found the response helpful, please mark it as the accepted solution and give kudos, as this will help other members with similar queries.

Thank You!

v-karpurapud
Community Support

Hi @PrachiJain_2025 

I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.

Thank you.

v-karpurapud
Community Support

Hi @PrachiJain_2025 

May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems find the answer faster.

Thank you.
