Hi all, I am trying to develop a Fabric notebook in a local VS Code environment. The documentation says the following:
"By selecting the kernel synapse-spark-kernel shipped with this extension, you can run the code cell on top of the remote Fabric Spark compute. Once you have selected this kernel, during runtime, the extension intercepts all the PySpark API calls and translates them to the corresponding http call to the remote Spark compute. For pure Python code, it's still executed in the local environment."
In my IDE, however, I do not see any option to select this kernel, and I cannot find any related information about it online.
How can I install the remote Spark kernel? It would be very beneficial for local development, rather than working from the Data Factory UI.
I am on Windows 10, by the way.
Hope you can help.
Kind Regards,
Kjell
Solved!
Hi @kjellvs ,
Thanks for using Fabric Community.
You should use the synapse-spark-kernel so that your notebook code runs without errors.
I have attached a screenshot for your reference.
When adding the Synapse VS Code extension to Visual Studio Code, there are some prerequisites that have to be installed on your local machine. If you install those prerequisites successfully, the extension will automatically create the synapse-spark-kernel environment and install PySpark and other packages. You will be able to find this environment after installing all the prerequisites.
Perhaps you have installed conda but forgot to set the environment variable.
When you activate the extension, it will show the initialization progress and output in the output channel like this:
Please refer to this document for installing the prerequisites: Link1
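As a quick sanity check before reinstalling anything, you can verify from a terminal whether the usual prerequisites are discoverable on your machine. This is a minimal sketch, not part of the extension; the exact prerequisite list and variable names are in the linked document, and the checks below (Conda and Java on PATH, JAVA_HOME set) are assumptions based on common Spark tooling requirements:

```python
import os
import shutil


def check_prerequisites():
    """Report which commonly required tools this shell can discover.

    Assumption: the extension finds tools via PATH and JAVA_HOME;
    consult the official prerequisites document for the exact list.
    """
    return {
        # shutil.which returns None when the executable is not on PATH
        "conda_on_path": shutil.which("conda") is not None,
        "java_on_path": shutil.which("java") is not None,
        # An empty or unset JAVA_HOME is treated as missing
        "java_home_set": bool(os.environ.get("JAVA_HOME")),
    }


if __name__ == "__main__":
    for name, ok in check_prerequisites().items():
        print(f"{name}: {'OK' if ok else 'MISSING'}")
```

If `conda_on_path` reports MISSING even though Conda is installed, adding Conda's install directory to PATH (or re-running the installer with the "add to PATH" option) usually resolves it; restart VS Code afterwards so the extension picks up the updated environment.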
Hope this helps. Please let us know if you have any further queries.
@v-nikhilan-msft thank you for your swift reply.
I missed the prerequisites; that's likely why it didn't work. I think the link you meant is this one: link2.
It works now, thanks!
Thank you for following up,
Kjell
Hi @kjellvs ,
Glad that your query got resolved. Please continue using Fabric Community for any help regarding your queries.