Hi,
Working my way through a POC, I am looking at Visual Studio Code integration, specifically for Spark Job Definitions. To rule out any issues I am using the Synapse extension (not Docker) natively in Visual Studio Code, and it seems to run okay. If I run 'Check Environment' via the Command Palette I get this message:
'The Synapse PySpark environment preparation is complete! Now, please switch your kernel to start exploring and working with it!'
I can then select my kernel.
I am hitting two issues (the first one appeared as I was writing this 🙂):
1. When refreshing a workspace, VS Code throws up a dialog saying "You have exceeded the amount of requests allowed in the current time frame and further requests will fail". The capacity is underutilised, so what exactly is triggering this message?
2. I have a Spark Job Definition that simply prints a test line to stdout, and it works fine; I can run it successfully via Visual Studio Code.
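For context, the job's main file is just a trivial PySpark script along these lines (a rough sketch of what I am running, not the exact file):

```python
# Minimal main file for the Spark Job Definition (illustrative sketch only).
# All it does is write one line to stdout so the run and debug paths can be exercised.
from pyspark.sql import SparkSession

if __name__ == "__main__":
    # getOrCreate() attaches to the session supplied by the Fabric/Synapse runtime,
    # or starts a local session when executed on a developer machine.
    spark = SparkSession.builder.appName("StdOutTest").getOrCreate()
    print("Test line written to stdout")  # the only observable output of this job
    spark.stop()
```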
When I come to select the 'Debug' option, however, nothing happens and no download is triggered. I've added the working folder root to my trust settings, but still nothing. Any ideas or similar experiences to share?
I am still encountering this.
With 20+ extensions installed (mostly utility and Azure related), I don't know of a way to tell whether one of them is responsible.
I did some checking, and it seems notebook autosave was triggering requests over and over; I will keep watching it.
EDIT:
A few hours without file autosave (the files.autoSave setting in VS Code, though it can also be set per file) did the trick in my case.
Hi @v-tsaipranay , I have a support case open and will update this thread as soon as I hear back.
Hello @DanielAmbler ,
Could you please confirm if this issue has been resolved? If you have found a solution, kindly share the insights here so that others in the community can easily find them.
Thank you.
Hello @DanielAmbler ,
May I ask whether you have resolved this issue? We apologize for any inconvenience this may have caused and are pleased to hear that you have raised a support ticket for it.
Your patience and understanding are greatly appreciated.
Thank you.
Hi @DanielAmbler,
Thanks for reaching out to the Microsoft Fabric community forum.
The first issue involves exceeding request limits, which can often be resolved by waiting or adjusting usage patterns.
The troubleshooting steps below can help resolve it:
The second issue involves the debugging feature not working, which may require checking configurations, trust settings, and extension updates.
Following the steps below should assist in resolving it:
Refer to the documentation below for the proper launch.json configuration for Spark:
For more detailed information, kindly refer to the documentation below:
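As a purely generic illustration of the "wait and retry" approach to throttling mentioned above (the function and URL below are hypothetical placeholders, not an API that the Synapse extension exposes or requires):

```python
# Generic sketch: retrying a throttled HTTP request with exponential backoff.
import time
import requests

def get_with_backoff(url, headers=None, max_retries=5):
    """Retry a GET request while the service answers 429 (too many requests)."""
    delay = 2  # seconds; doubled after each throttled attempt
    response = None
    for attempt in range(max_retries):
        response = requests.get(url, headers=headers)
        if response.status_code != 429:
            return response
        # Honour the Retry-After header if the service provides one.
        retry_after = response.headers.get("Retry-After", "")
        wait = int(retry_after) if retry_after.isdigit() else delay
        time.sleep(wait)
        delay *= 2
    return response

# Example with a placeholder endpoint:
# resp = get_with_backoff("https://example.com/api/workspaces")
```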
I hope these suggestions give you good ideas; if you need any further assistance, feel free to reach out.
If this post helps, please give us Kudos and consider accepting it as a solution to help other members find it more quickly.
Thank you.
Hi @v-tsaipranay ,
I can confirm Microsoft have fixed the bug in the pre-release Synapse extension for Visual Studio Code. I've just finished testing this morning, hence the delayed response.
Hi @DanielAmbler,
We haven't received any updates from you in a while and wanted to follow up to see whether you have had a chance to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give it Kudos so other members can easily find it.
Thank you.