Hello,
I have a few questions about pricing and managing Spark settings in Fabric. Right now I'm on a free trial, which is why I'd like to get a better understanding of the costs before I commit.
Suppose this scenario: I have a simple pipeline consisting of a notebook activity and an Outlook activity that emails me on failure.
I want to understand how this pipeline is billed. The pipeline (and the notebook) itself usually takes less than 1 minute to run.
But I have noticed on the monitoring page that there are more than 20 minutes of idle time for the notebook activity.
So my questions are:
- What will I be billed for: only that one minute of running time, or the whole 25 minutes (including the idle time)?
- Right now I don't terminate the Spark session programmatically at the end of the notebook. Is it good practice to do so if I won't need it for the rest of the day, and if so, how do I do it? Will it save me costs?
- Let's say I opt for the smallest capacity (F2) when I go live. Will I have to modify any of the Spark settings in my workspace, or can I leave them at their defaults?
Thanks in advance. I really tried to understand the topic by myself, but I couldn't figure out the answers on my own.
Thank you for the reply! I have read and reread the documentation, and according to it I would not be billed for the idle time of a Spark cluster in Microsoft Fabric (unlike in Synapse Analytics):
https://learn.microsoft.com/en-us/fabric/data-engineering/billing-capacity-management-for-spark
Am I missing something?
If I am mistaken and I would still be billed for idle time, how do I stop the Spark session programmatically from a notebook? According to the Synapse documentation, I can use mssparkutils.session.stop() to stop Spark in Synapse, but Fabric's documentation lists no such method.
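For reference, this is the pattern the Synapse documentation describes, as the last cell of a notebook; whether Fabric supports the same call (or an equivalent in its NotebookUtils) is exactly what I can't confirm:

```python
# Final notebook cell: explicitly end the Spark session so no idle time
# accrues after the real work is done. mssparkutils is documented for
# Synapse; its availability in Fabric is the open question here.
from notebookutils import mssparkutils

mssparkutils.session.stop()  # tears down the current Spark session
```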
Hi @johtok ,
Question 1: Runtime vs. Idle Time Billing
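To make the two possible readings of the question concrete, here is purely illustrative arithmetic. The price per CU-hour below is a made-up placeholder (check the Azure pricing page for real numbers); the only firm input is that an F2 SKU provides 2 capacity units:

```python
# Illustrative only: compares the cost of one pipeline run under the two
# interpretations in the question. PRICE_PER_CU_HOUR is a hypothetical
# example rate, not a quoted price; F2 = 2 capacity units (CUs).
PRICE_PER_CU_HOUR = 0.20
CAPACITY_CUS = 2  # F2 SKU

def run_cost(minutes: float) -> float:
    """Cost of consuming the full capacity for the given number of minutes."""
    return CAPACITY_CUS * (minutes / 60) * PRICE_PER_CU_HOUR

print(f"Execution time only (1 min):      {run_cost(1):.4f}")
print(f"Including idle session (25 min):  {run_cost(25):.4f}")
```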
Question 2: Is it a good practice to terminate Spark capacity at the end of a notebook if I no longer need to use it that day?
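Beyond ending the session inside the notebook itself, a capacity that is not needed for the rest of the day can be paused entirely. A minimal sketch using the Azure Resource Manager suspend call for Fabric capacities; the subscription, resource group, capacity name, and API version below are placeholders to adapt:

```python
# Sketch: pause (suspend) a Fabric F capacity via Azure Resource Manager
# so it stops consuming until resumed. All identifiers are placeholders;
# the api-version may differ in your tenant.
import requests
from azure.identity import DefaultAzureCredential

SUB = "<subscription-id>"
RG = "<resource-group>"
CAP = "<capacity-name>"
url = (
    f"https://management.azure.com/subscriptions/{SUB}"
    f"/resourceGroups/{RG}/providers/Microsoft.Fabric/capacities/{CAP}"
    "/suspend?api-version=2022-07-01-preview"
)

token = DefaultAzureCredential().get_token("https://management.azure.com/.default")
resp = requests.post(url, headers={"Authorization": f"Bearer {token.token}"})
resp.raise_for_status()  # a 2xx response means the pause request was accepted
```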
Question 3: If I choose the smallest capacity (F2) when I go live, do I need to change the Spark settings in my workspace or can I leave them at the default settings?
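One quick way to sanity-check whether the defaults fit an F2 before going live is to print what the session is actually configured with. These are standard Spark properties read through spark.conf; the `spark` session object is pre-created in Fabric notebooks:

```python
# Inspect the effective Spark settings of the current session to judge
# whether the workspace defaults are reasonable for a small F2 capacity.
for key in (
    "spark.executor.cores",
    "spark.executor.memory",
    "spark.dynamicAllocation.enabled",
    "spark.dynamicAllocation.maxExecutors",
):
    print(key, "=", spark.conf.get(key, "<not set>"))
```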
Best Regards,
Adamk Kong
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.