March 31 - April 2, 2025, in Las Vegas, Nevada. Use code MSCUST for a $150 discount! Early bird discount ends December 31.
Databricks Connect V2 lets you use Visual Studio Code as an IDE to debug Databricks notebooks:
Debug your code and notebooks by using Visual Studio Code | Databricks Blog
Does Microsoft Fabric or Azure Synapse Analytics (ASA) provide similar functionality?
I just noticed in Fabric that there is an "Open in VS Code" button for my notebook. I'm not sure why I didn't notice it before. When I click that button, my notebook opens in VS Code as a *.ipynb file. I can then step through my program until I get to this line:
mssparkutils.fs.ls("./Files")
at which point I get the error:
"NameError: name 'mssparkutils' is not defined"
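This NameError is what you would expect outside a Fabric/Synapse runtime: `mssparkutils` is injected by the remote kernel, not installed locally. One workaround while debugging is to guard the import and fall back to the local filesystem. This is a hedged sketch, not an official pattern; the `notebookutils` import path is the one used inside Fabric runtimes, and `list_files` is a helper invented here for illustration.

```python
import os

try:
    # Inside a Fabric/Synapse runtime, mssparkutils is provided via notebookutils.
    from notebookutils import mssparkutils  # type: ignore

    def list_files(path):
        # mssparkutils.fs.ls returns FileInfo objects; keep just the paths
        # so the result matches the local fallback below.
        return [f.path for f in mssparkutils.fs.ls(path)]
except ImportError:
    # Running locally (plain VS Code / Jupyter): fall back to os.listdir.
    def list_files(path):
        return [os.path.join(path, name) for name in os.listdir(path)]

print(list_files("."))
```

With this guard, the same cell runs locally (listing the local directory) and in Fabric (listing the lakehouse path), which makes stepping through the rest of the notebook possible even before the remote kernel is configured.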
Hi @Shawn_Eary ,
Thanks for using Fabric Community. Apologies for the delayed reply from our side.
You can refer to this documentation: link
Please install the Synapse VS Code extension in Visual Studio Code and download the necessary software.
Hope this helps. Do let us know if you have any further queries.
Hi @Shawn_Eary ,
Following up to check whether your issue has been resolved. Please let us know if you have any further queries.
The extension was and still is installed, but now I'm getting a different error, and I'm not sure why. Now this line fails:
spark.read.format("csv").option("header", "true").load(f)
I chose Environments -> Python 3.11.6, because I didn't know of a Jupyter Server URL that MS Fabric provides.
Is there some way for me to get access to a Spark context and the files in my MS Fabric instance instead of apparently running my Python Notebook locally?
My understanding is that Databricks already has the functionality I'm looking for.
Hi @Shawn_Eary ,
The issue you are facing is due to the Python environment you are using. You should select the synapse-spark-kernel for your notebook code to run without errors.
I have attached the screenshot for your reference.
When adding the Synapse VS Code extension to Visual Studio Code, there are some prerequisites that have to be installed on your local desktop. If you install those prerequisites successfully, the extension will automatically create the synapse-spark-kernel and install PySpark and other packages. You will be able to find this environment after installing all the prerequisites.
Please refer to this document for installing the prerequisites: Link
Yes, you can connect to OneLake via Azure Databricks.
Please refer to this link for connecting to Databricks: Link2
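For reference, OneLake exposes an ADLS Gen2-compatible endpoint, so a Databricks cluster that has credentials configured for the Fabric tenant can read a Lakehouse file through an `abfss` URI. The helper below only assembles that URI; the workspace, lakehouse, and file names are placeholders, and the commented `spark.read` call assumes a configured Databricks cluster.

```python
def onelake_abfss_path(workspace, lakehouse, relative_path):
    # OneLake's DFS endpoint follows the ADLS Gen2 URI scheme:
    # abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<lakehouse>.Lakehouse/<path>
    return (f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
            f"{lakehouse}.Lakehouse/{relative_path}")

path = onelake_abfss_path("MyWorkspace", "MyLakehouse", "Files/data.csv")
print(path)

# Then, on a Databricks cluster with access to the tenant:
# df = spark.read.format("csv").option("header", "true").load(path)
```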
Hope this helps. Please do let us know if you have any further questions.
Hi @Shawn_Eary ,
Following up to check whether your issue has been resolved. Please let me know if you have any further questions.