Ali_Cruz
New Member

How can I kill a Spark query that has been running for a long period of time in Fabric?

I found a query in the Spark UI that has been running for 20 hours, but I could not find a way to kill it. The query is not showing any Running Job IDs, and the previous job associated with the query has a FAILED status. Here is the query that I saw in the Spark UI:

[Screenshot: the long-running query shown in the Spark UI]
This is the job associated with the last Sub Execution IDs:

[Screenshot: Spark UI details for the job associated with the last Sub Execution IDs]

Has anybody experienced a similar situation before? Is there a way to kill a query that is still running in Spark?
1 REPLY
nilendraFabric
Community Champion

Hi @Ali_Cruz

Did you check the Monitoring hub?

1. Navigate to the Monitoring hub in your Fabric workspace.
2. Locate the activity that’s been running for an extended period (e.g., your 20-hour query).
3. Click “Cancel” next to the activity and confirm the action.
4. Wait 2–3 minutes for the system to terminate the process.
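If the Cancel button in the UI does not seem to take effect, the same cancellation can usually be triggered programmatically. Below is a minimal sketch in Python, assuming the Fabric Job Scheduler REST API's cancel endpoint; the workspace, item, and job-instance IDs are placeholders you would look up in the Monitoring hub details pane, and the token handling is left to whatever auth flow you already use:

```python
import requests

# Hedged sketch: ask the Fabric Job Scheduler REST API to cancel a job instance
# when the Monitoring hub "Cancel" button does not take effect.
# All IDs below are placeholders.
FABRIC_API = "https://api.fabric.microsoft.com/v1"
workspace_id = "<workspace-id>"        # placeholder
item_id = "<notebook-or-sjd-item-id>"  # placeholder
job_instance_id = "<job-instance-id>"  # placeholder
token = "<aad-bearer-token>"           # e.g. obtained via azure.identity or az cli

resp = requests.post(
    f"{FABRIC_API}/workspaces/{workspace_id}/items/{item_id}"
    f"/jobs/instances/{job_instance_id}/cancel",
    headers={"Authorization": f"Bearer {token}"},
)

# A 202-style accepted response means the cancellation was queued; the instance
# may still take a few minutes to move to a Cancelled state, as noted above.
print(resp.status_code, resp.text)
```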

 

No visible Job IDs? That indicates a zombie state, so focus on the Monitoring hub method above.
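If you still have a notebook attached to the same Spark session that owns the stuck query, another fallback is to ask Spark itself to cancel its jobs. This is only a sketch: cancelAllJobs and cancelJobGroup are standard PySpark SparkContext calls, the job-group id is a placeholder, and none of this can reach a query running in a different (zombie) session:

```python
from pyspark.sql import SparkSession

# Works only from the notebook attached to the SAME Spark session that owns
# the stuck query; a new session cannot cancel another session's jobs.
spark = SparkSession.builder.getOrCreate()

spark.sparkContext.cancelAllJobs()             # cancel every active job in this session
# spark.sparkContext.cancelJobGroup("<group>") # or target a specific job group (placeholder id)

# Last resort: stop the whole session so its executors are released.
spark.stop()
```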

 
