Spark Cost
I am currently using the Fabric trial version, which has F64 capacity. I used the Fabric Capacity Metrics app and found the CU usage for my Spark streaming notebook named "EventStreaming". How much will the cost be for the CU (s) and duration (s) I have utilized for this compute? Also, I would like to know whether using Spark streaming notebooks will incur more cost than the published Fabric capacity costs.
Hi @_augustine_ ,
The duration is approximately 64,299.78 seconds, which is equivalent to approximately 17.86 hours.
The cost per hour for F64 capacity is $11.52.
Let's calculate the total cost:
Total cost = cost per hour × duration in hours = $11.52 × (64,299.78 ÷ 3,600) ≈ $205.76
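If you want to reproduce this estimate yourself, here is a minimal Python sketch of the same arithmetic. It assumes the F64 pay-as-you-go rate of $11.52 per hour quoted above (actual rates vary by region) and the duration reported by the Capacity Metrics app; the variable names are only illustrative.

```python
# Rough cost estimate for a Spark workload on F64 capacity.
# Assumes the $11.52/hour pay-as-you-go rate quoted above (rates vary by region).

F64_RATE_PER_HOUR = 11.52      # USD per hour for the full F64 capacity (assumed rate)
duration_seconds = 64_299.78   # duration reported by the Fabric Capacity Metrics app

duration_hours = duration_seconds / 3600
estimated_cost = duration_hours * F64_RATE_PER_HOUR

print(f"Duration: {duration_hours:.2f} hours")       # ~17.86 hours
print(f"Estimated cost: ${estimated_cost:.2f}")      # ~$205.76
```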
Regarding your concern about whether Spark streaming notebooks incur more cost than the published Fabric capacity rates: Spark pools are billed based on active Spark session duration. You are not billed for the time taken to acquire cluster instances from the cloud or to initialize the Spark context, so the cost is tied directly to the compute consumed while your Spark jobs are actively running.
For detailed guidance on managing and optimizing Spark compute costs, I recommend reviewing the official Microsoft Fabric documentation on Spark compute billing.
Best Regards,
Neeko Tang
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.