DanielAmbler
Helper II

Debug Spark Job Definition - No Response when selecting debug and Exceeded the amount of requests

Hi,

 

Working my way through a POC, I am looking at Visual Studio Code integration, specifically for Spark Job Definitions.  To rule out any issues I am using the Synapse extension natively in Visual Studio Code (not via Docker), and it seems to run okay.  If I run 'Check Environment' from the Command Palette I get this message:

 

'The Synapse PySpark environment preparation is complete! Now, please switch your kernel to start exploring and working with it!'

 

I can then select my kernel

[Screenshot: DanielAmbler_0-1736162694758.png]

 

I am hitting two issues (the first one appeared as I was writing this 🙂):

 

1.  When refreshing a workspace, VS Code throws up a dialog: "You have exceeded the amount of requests allowed in the current time frame and further requests will fail".  The capacity is underutilised, so what exactly is triggering this message?

 

2.  I have a Spark Job Definition that simply prints a test line to StdOut, and it works fine; I can run it successfully via Visual Studio Code.

 

When I come to select the 'Debug' option, nothing happens and no download is triggered.  I've added the working folder root to my trust settings, but still nothing.  Any ideas or similar experiences to share?
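For context, the job itself is nothing more exotic than a minimal entry-point script along these lines (a sketch; the file name, message text, and `main` helper are placeholders, not the actual code):

```python
# Hypothetical minimal Spark Job Definition entry point.
# No SparkSession is needed just to prove that StdOut reaches the driver logs.
import sys


def main() -> str:
    message = "Spark Job Definition test line"
    print(message)  # shows up under StdOut in the driver logs
    return message


if __name__ == "__main__":
    sys.exit(0 if main() else 1)
```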

1 ACCEPTED SOLUTION

Hi @v-tsaipranay ,

 

I can confirm Microsoft has fixed the bug in the pre-release Synapse extension for Visual Studio Code.  I've just finished testing this morning, hence the delayed response.

 

[Screenshot: DanielAmbler_0-1738315303383.png]

 


7 REPLIES
mathewthewise
Frequent Visitor

I am still encountering this.

 

With 20+ extensions installed (mostly utility and Azure-related), I don't know of a way to tell whether one of them is responsible.
I did some checking, and it seems notebook autosave was triggering requests over and over; I will keep watching it.

 

EDIT:

A few hours without file autosave (a VS Code setting, though it can also be set per file) did the trick in my case.
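For reference, autosave can be switched off globally in VS Code's settings.json; `files.autoSave` is a standard VS Code setting, not part of the Synapse extension, so this is just the generic way to apply the fix above:

```json
{
  // Stop autosave from firing a sync/request on every save tick.
  "files.autoSave": "off"
}
```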

DanielAmbler
Helper II

Hi @v-tsaipranay , I have a support case open and will update this thread as soon as I hear back.

Hello @DanielAmbler ,

Could you please confirm if this issue has been resolved? If you have found a solution, kindly share the insights here so that others in the community can easily find them.

 

Thank you.

Hello @DanielAmbler ,

 

May I ask if you have resolved this issue? We apologise for any inconvenience this may have caused and are pleased to hear that you raised a ticket about the issue.

Your hard work and understanding are greatly appreciated.

 

Thank you.

v-tsaipranay
Community Support
Community Support

Hi @DanielAmbler
Thanks for reaching out to the Microsoft Fabric community forum.


The first issue involves exceeding request limits, which can often be resolved by waiting or adjusting usage patterns.

The following troubleshooting steps may resolve it:

  • Avoid refreshing your workspace frequently. Changes will appear in due course without constant refreshes.
  • Restart VS Code; a simple restart can clear temporary request queues.
  • Other extensions might be making requests in the background, contributing to the limit. Temporarily disable some extensions to identify the culprit.
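As an aside, if you are calling throttled APIs from your own scripts, the conventional way to absorb "too many requests" responses is exponential backoff with jitter. A minimal sketch follows; `RateLimitError` and the `request` callable are hypothetical stand-ins, not part of any Fabric SDK:

```python
import random
import time


class RateLimitError(Exception):
    """Hypothetical stand-in for an HTTP 429 'too many requests' failure."""


def call_with_backoff(request, max_retries=5, base_delay=0.5):
    """Retry a throttled callable, doubling the wait on each attempt."""
    for attempt in range(max_retries):
        try:
            return request()
        except RateLimitError:
            if attempt == max_retries - 1:
                raise  # give up after the final attempt
            # Exponential backoff plus a little jitter to avoid
            # retrying in lockstep with other clients.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))
```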

The second issue involves the debugging feature not working; it may require checking configurations, trust settings, and extension updates.

The following steps should help:

  • Ensure you have a correctly configured launch.json file within your .vscode folder (create it if it's missing). This file specifies how VS Code should debug your Spark application. It should include information about the Spark application's entry point and any necessary parameters.

Refer to the documentation below for the proper launch.json configuration for Spark:

https://learn.microsoft.com/en-us/fabric/data-engineering/setup-vs-code-extension#install-the-extens...

  • Review the VS Code output panel (usually accessed through View > Output) for any error messages related to the Synapse extension or your Spark job debugging attempt.
  • The extension could have a bug. Try updating it or reinstalling it. Sometimes, a simple restart can resolve minor software glitches.

For more detail, refer to the documentation below:

https://learn.microsoft.com/en-us/fabric/data-engineering/author-sjd-with-vs-code#debug-spark-job-de...
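For illustration, a generic VS Code launch.json for debugging a Python entry point looks like the following. The exact `type` and attributes the Synapse extension expects may differ, so treat every value here as a placeholder and check the linked documentation for the real schema:

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      // Placeholder configuration: launch the job's entry-point
      // script under the standard Python debugger.
      "name": "Debug Spark Job Definition entry point",
      "type": "debugpy",
      "request": "launch",
      "program": "${workspaceFolder}/main.py",
      "console": "integratedTerminal"
    }
  ]
}
```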

 

I hope these suggestions give you some good ideas. If you need any further assistance, feel free to reach out.

 

If this post helps, please give us Kudos and consider accepting it as a solution to help other members find it more quickly.

 

Thank you.


Hi @DanielAmbler

We haven't received any updates from you in a while and wanted to follow up to see if you had a chance to review the provided information. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.


Thank you.
