johtok
Regular Visitor

Help to understand the billing and spark capacity

Hello,

I have a few questions about pricing and managing Spark settings in Fabric. Right now I am on a free trial, so I would like to get a better understanding of the costs before I commit.

 

Consider this scenario: I have a simple pipeline consisting of a notebook activity and an Outlook activity that mails me on failure.

 

I want to understand the billing side of my pipeline. The pipeline (and the notebook) itself usually takes less than one minute to run:

[Screenshot: pipeline.JPG]

But I have noticed on the monitoring page that there are more than 20 minutes of idle time for the notebook activity:

[Screenshot: notebook.JPG]

So my questions are:

- what will I be billed for - is it only that one minute of running time, or the whole 25 minutes (including the idle time)?

- right now I don't have any programmatic way to terminate the Spark session at the end of the notebook. Is it good practice to do so if I will not need it for the rest of the day, and if so, how do I do it? Will it save me costs?

- let's say I opt for the smallest capacity (F2) when I go live; will I have to modify any of the Spark settings in my workspace, or can I leave them at their defaults?

[Screenshot: spark.JPG]

 

 

Thanks in advance. I really tried to understand this topic myself, but I couldn't figure out the answer on my own.

2 REPLIES
johtok
Regular Visitor

Thank you for the reply! I have read and reread the documentation, and according to it I would not be billed for the idle time of a Spark cluster in Microsoft Fabric (unlike in Synapse Analytics):

https://learn.microsoft.com/en-us/fabric/data-engineering/billing-capacity-management-for-spark

[Screenshot: doc.JPG]

 

Am I missing something?

 

If I am mistaken and I would still be billed for idle time, how do I stop the capacity programmatically from notebooks? According to the Synapse documentation, I can use mssparkutils.session.stop() to stop Spark in Synapse, but there is no such method in Fabric according to Fabric's documentation.

 

Anonymous
Not applicable

Hi @johtok ,

 

Question 1: Runtime vs. Idle Time Billing

  • Billing is based on the total time the Spark cluster is running, including active processing time and idle time.
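To put rough numbers on this, here is an illustrative calculation. It assumes the mapping from the billing docs of one capacity unit (CU) = two Spark VCores, and that metering runs for the whole session lifetime, idle time included; the 4-VCore pool size is a hypothetical example, not a statement of the F2 default:

```python
def cu_seconds(spark_vcores: int, session_seconds: float) -> float:
    """CU-seconds consumed by a Spark session.

    Assumption: 1 capacity unit (CU) = 2 Spark VCores, and metering
    covers the entire session lifetime, idle time included.
    """
    return spark_vcores / 2 * session_seconds

# Hypothetical 4-VCore session: one minute of actual work vs. a session
# that stays alive for 25 minutes before the idle timeout releases it.
active_only = cu_seconds(4, 1 * 60)    # 120.0 CU-seconds
with_idle = cu_seconds(4, 25 * 60)     # 3000.0 CU-seconds
print(active_only, with_idle)
```

Under these assumptions, letting the session idle out consumes 25x the CU-seconds of the actual work, which is why the session lifetime matters for cost.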

Question 2: Is it a good practice to terminate Spark capacity at the end of a notebook if I no longer need to use it that day?

  • Yes, it is a good practice to terminate Spark Capacity if you no longer need to use it.
    This can significantly reduce costs because billing is based on the total runtime of the Spark cluster.
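A minimal sketch of what that could look like as the last step of a notebook. It assumes the mssparkutils API documented for Synapse is also exposed in the Fabric notebook runtime, which is worth verifying against the current Fabric documentation:

```python
def stop_spark_session() -> bool:
    """Stop the Spark session so the cluster is released immediately
    instead of waiting for the idle timeout.

    Assumption: the mssparkutils module known from Synapse is also
    importable inside the Fabric notebook runtime.
    """
    try:
        from notebookutils import mssparkutils  # only exists inside the runtime
    except ImportError:
        return False  # running outside a Spark notebook; nothing to stop
    mssparkutils.session.stop()
    return True

# Call this in the final cell of the notebook, once all work is done:
# stop_spark_session()
```

The try/except guard simply makes the function a no-op outside the notebook environment, so it can live in shared utility code.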

Question 3: If I choose the smallest capacity (F2) when I go live, do I need to change the Spark settings in my workspace or can I leave them at the default settings?

  • When choosing the minimum capacity (F2), you can usually leave the default settings, but you will need to monitor and adjust them to optimize performance and cost.

 

Best Regards,
Adamk Kong

 

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
