Hi all, I am trying to develop a Fabric notebook in a local VS Code environment. The documentation says the following:
"By selecting the kernel synapse-spark-kernel shipped with this extension, you can run the code cell on top of the remote Fabric Spark compute. Once you have selected this kernel, during runtime, the extension intercepts all the PySpark API calls and translates them to the corresponding http call to the remote Spark compute. For pure Python code, it's still executed in the local environment."
In my IDE, however, I do not see any option to select this kernel, and I cannot find any related information about this online.
How can I install the remote Spark kernel? It would be very beneficial for local development, rather than working from the Data Factory UI.
I am on Windows 10, by the way.
Hope you can help.
Kind Regards,
Kjell
Hi @kjellvs ,
Thanks for using Fabric Community.
You should use the synapse-spark-kernel so that your notebook code runs without errors.
I have attached a screenshot for your reference.
When adding the Synapse VS Code extension to Visual Studio Code, there are some prerequisites that have to be installed on your local machine. If you install those prerequisites successfully, the extension will automatically create the synapse-spark-kernel and install PySpark and other packages. You will be able to find this environment after installing all the prerequisites.
Perhaps you have installed Conda but forgot to set the environment variable; you can check this with the sketch below.
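If it helps, here is a minimal local diagnostic you can run. The variable names are common Conda conventions, not requirements documented by the extension:

```python
# Minimal local diagnostic: is Conda visible to the shell VS Code runs in?
# The environment variable names below are common Conda conventions,
# not the extension's documented requirements.
import os
import shutil

conda_exe = shutil.which("conda")
print("conda on PATH:", conda_exe or "NOT FOUND")

for var in ("CONDA_HOME", "CONDA_PREFIX", "PATH"):
    print(f"{var} = {os.environ.get(var, '<unset>')}")
```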
When you activate the extension, it shows the initialization progress and output in the output channel.
Please refer to this document for installing the prerequisites: Link1
Hope this helps. Please let us know if you have any further queries.
@Anonymous, thank you for your swift reply.
I missed the prerequisites; that's likely why it doesn't work. I think the link you meant is this one: link2.
It works now, thanks!
Thank you for following up,
Kjell
Hi @kjellvs ,
Glad your query got resolved. Please continue to use the Fabric Community for help with any future queries.