Spark Cost
I am currently using the Fabric trial version, which has F64 capacity. I used the Fabric Capacity Metrics app and found the CU usage for my Spark streaming notebook named "EventStreaming". How much will the CU (s) and duration (s) I have utilized for this compute cost? Also, I would like to know whether using Spark streaming notebooks will incur more cost than the published Fabric capacity prices.
Hi @_augustine_ ,
The duration is approximately 64,299.78 seconds, which is roughly 17.86 hours.
The pay-as-you-go cost for F64 capacity is $11.52 per hour.
Total cost = cost per hour × duration (hours) = $11.52 × 17.86 ≈ $205.76
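If you want to re-run the arithmetic yourself (for example with a different duration figure from the Capacity Metrics app), here is a minimal Python sketch. The duration value and the $11.52/hour F64 pay-as-you-go rate are taken from the numbers above; your regional pricing may differ.

```python
# Rough cost estimate for the Spark streaming notebook run.
# Inputs are assumptions taken from this thread; substitute your own values.
DURATION_SECONDS = 64_299.78    # duration reported by the Fabric Capacity Metrics app
F64_PRICE_PER_HOUR = 11.52      # USD per hour, F64 pay-as-you-go rate (region-dependent)

duration_hours = DURATION_SECONDS / 3600
estimated_cost = F64_PRICE_PER_HOUR * duration_hours

print(f"Duration: {duration_hours:.2f} h, estimated cost: ${estimated_cost:.2f}")
# Output: Duration: 17.86 h, estimated cost: $205.76
```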
Regarding your concern about whether Spark streaming notebooks incur more cost than the stated Fabric capacity prices: Spark pools are billed based on the active Spark session duration only. You are not billed for the time taken to acquire cluster instances from the cloud or to initialize the Spark context. In other words, the cost is driven entirely by the compute consumed during the active session of your Spark jobs, at the capacity's standard rate.
For detailed guidance on managing and optimizing Spark compute costs, I recommend reviewing the Microsoft Fabric documentation on Spark billing and capacity utilization.
Best Regards,
Neeko Tang
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.