Hi all, I am trying to develop a Fabric notebook in a local VS Code environment. The documentation says the following:
"By selecting the kernel synapse-spark-kernel shipped with this extension, you can run the code cell on top of the remote Fabric Spark compute. Once you have selected this kernel, during runtime, the extension intercepts all the PySpark API calls and translates them to the corresponding http call to the remote Spark compute. For pure Python code, it's still executed in the local environment."
In my IDE, however, I do not see any option to select this kernel, and I cannot find any related information about it online.
How can I install the remote Spark kernel? It would be very beneficial to develop locally rather than from the Data Factory UI.
I am on Windows 10, by the way.
Hope you can help.
Kind Regards,
Kjell
Hi @kjellvs ,
Thanks for using Fabric Community.
You should use the synapse-spark-kernel for your notebook code to run without errors.
I have attached a screenshot for your reference.
When adding the Synapse VS Code extension to Visual Studio Code, there are some prerequisites that have to be installed on your local machine. If you install those prerequisites successfully, the extension will automatically create the synapse-spark-kernel and install PySpark and the other packages. You will be able to find this environment after installing all the prerequisites.
Maybe you have installed Conda but forgot to set the environment variable.
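As a quick sanity check (a rough sketch; see the prerequisites document linked below for the exact variables your setup needs), you can verify from Python that Conda and Java are visible to your environment:

    # Illustrative check that the prerequisite tools are discoverable.
    import os
    import shutil

    # Conda must be on PATH for the extension to create its environment.
    print("conda on PATH:", shutil.which("conda") or "NOT FOUND")

    # JAVA_HOME is a common prerequisite for local Spark tooling;
    # check the linked document for the exact variables required.
    print("JAVA_HOME:", os.environ.get("JAVA_HOME", "NOT SET"))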
When you activate the extension, it will show the initialization progress and output in the output channel.
Please refer to this document for installing the prerequisites: Link1
Hope this helps. Please let us know if you have any further queries.
@Anonymous thank you for your swift reply.
I missed the prerequisites; that's likely why it didn't work. I think the link you meant is this one: link2.
It works now, thanks!
Thank you for following up,
Kjell
Hi @kjellvs ,
Glad your query got resolved. Please continue to use the Fabric Community for help with any future queries.