Hi all, I am trying to develop a Fabric notebook in a local VS Code environment. The documentation says the following:
"By selecting the kernel synapse-spark-kernel shipped with this extension, you can run the code cell on top of the remote Fabric Spark compute. Once you have selected this kernel, during runtime, the extension intercepts all the PySpark API calls and translates them to the corresponding http call to the remote Spark compute. For pure Python code, it's still executed in the local environment."
In my IDE, however, I do not see any option to select this kernel, and I cannot find any related information about it online.
How can I install the remote Spark kernel? It would be very beneficial to develop locally rather than from the Data Factory UI.
I am on Windows 10, by the way.
Hope you can help.
Kind Regards,
Kjell
Hi @kjellvs ,
Thanks for using Fabric Community.
You should use the synapse-spark-kernel so that your notebook code runs without errors.
When you add the Synapse VS Code extension to Visual Studio Code, there are some prerequisites that have to be installed on your local machine. If you install those prerequisites successfully, the extension will automatically create the synapse-spark-kernel and install PySpark and other packages. You will be able to find this kernel after installing all the prerequisites.
Perhaps you have installed Conda but forgot to set the environment variable.
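As a quick sanity check (just a sketch, assuming the prerequisites in the linked document include Conda and Java), you can verify that the tools are discoverable from your environment:

```python
# Rough check that the tools the extension bootstraps from are on PATH.
# Assumes the prerequisites are Conda and Java, per the linked document.
import os
import shutil

for tool in ("conda", "java"):
    location = shutil.which(tool)
    print(f"{tool}: {location or 'NOT FOUND on PATH'}")

# JAVA_HOME is often needed as well; "not set" suggests a missing variable.
print("JAVA_HOME:", os.environ.get("JAVA_HOME", "not set"))
```

If `conda` shows as not found, adding your Conda installation to PATH (or re-running the installer with the PATH option enabled) should resolve it.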
When you activate the extension, it shows the initialization progress and output in the Output channel.
Please refer to this document for installing the prerequisites: Link1
Hope this helps. Please let us know if you have any further queries.
@v-nikhilan-msft thank you for your swift reply.
I missed the prerequisites; that's likely why it didn't work. I think the link you meant is this one: link2.
It works now, thanks!
Thank you for following up,
Kjell
Hi @kjellvs ,
Glad your query got resolved. Please continue using the Fabric Community for help with any further queries.