Shawn_Eary
Advocate IV

Databricks ConnectV2 for Microsoft Fabric?

Databricks ConnectV2 lets you use Visual Studio Code as an IDE to debug Databricks notebooks:
Debug your code and notebooks by using Visual Studio Code | Databricks Blog

Does Microsoft Fabric or Azure Synapse Analytics (ASA) provide similar functionality?

6 REPLIES
Shawn_Eary
Advocate IV

I just noticed that Fabric has an "Open in VS Code" button for my notebook. Not sure why I didn't notice it before. When I click that button, my notebook opens in VS Code as a *.ipynb file. I can then step through my program until I reach this line:

mssparkutils.fs.ls("./Files")

at which point I get the error:
"NameError: name 'mssparkutils' is not defined"

Hi @Shawn_Eary ,

Thanks for using the Fabric Community. Apologies for the delay in our reply.

You can refer to this documentation: link
Please install the Synapse VS Code extension in Visual Studio Code and download the required software.
Hope this helps. Do let us know if you have any further queries.

(screenshot: vnikhilanmsft_0-1696441114376.png)

Hi @Shawn_Eary ,
Following up to check whether your issue has been resolved. Please let us know if you have any further queries.

The extension was, and still is, installed, but now I'm getting a different error and I'm not sure why. Now this line:

spark.read.format("csv").option("header", "true").load(f)

spits out the error:
NameError: name 'spark' is not defined
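Same root cause as before: `spark` is a global injected by the remote Spark kernel, so a local Python kernel never defines it. A small defensive sketch (plain Python, no Fabric-specific API assumed) that reports the situation instead of raising NameError mid-notebook:

```python
def get_spark():
    """Return the kernel-injected `spark` session, or None when the
    notebook is running on a plain local Python kernel (the situation
    that produces the NameError above)."""
    try:
        return spark  # noqa: F821 - defined only by the synapse-spark-kernel
    except NameError:
        return None


session = get_spark()
if session is None:
    print("No Spark context: switch the kernel to synapse-spark-kernel")
```

When `session` is not None, the usual `session.read.format("csv")...` calls work as in the line above.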
 
When I started the debugger, I was given the option to choose between "Python Environments" and "Existing Jupyter Server":

(screenshot: Shawn_Eary_0-1696605552721.png)

I chose Environments -> Python 3.11.6:

(screenshot: Shawn_Eary_1-1696605651021.png)

because I didn't know of a Jupyter Server URL that Microsoft Fabric provides.

Is there some way for me to get access to a Spark context and the files in my MS Fabric instance, instead of (as it appears) running my Python notebook locally?

My understanding is that Databricks already has the functionality I'm looking for.

Hi @Shawn_Eary ,
The issue you are facing is due to the Python environment you are using. You should use the synapse-spark-kernel so that your notebook code runs without errors.
I have attached a screenshot for your reference.

(screenshot: vnikhilanmsft_1-1696692739073.png)
While adding the Synapse VS Code extension to Visual Studio Code, there are some prerequisites that have to be installed on your local machine. If you install those prerequisites successfully, the extension will automatically create the synapse-spark-kernel and install PySpark and other packages. You will be able to find this environment after installing all the prerequisites.

(screenshot: vnikhilanmsft_0-1696692642950.png)

Please refer to this document for installing the prerequisites: Link

Yes, you can connect to OneLake via Azure Databricks.
Please refer to this link for connecting to Databricks: Link2
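For reference, OneLake exposes lakehouse files through an ABFSS URI that external Spark engines such as Databricks can read. A small sketch of building that path (the URI shape follows Microsoft's OneLake documentation as I understand it; the workspace and lakehouse names below are placeholders, not from this thread):

```python
def onelake_files_path(workspace: str, lakehouse: str, relative: str) -> str:
    """Build the ABFSS URI under which OneLake exposes a lakehouse's
    Files folder, suitable for spark.read.load() from Databricks.
    All names are caller-supplied placeholders."""
    return (
        f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
        f"{lakehouse}.Lakehouse/Files/{relative}"
    )


# Hypothetical example workspace/lakehouse names:
print(onelake_files_path("MyWorkspace", "MyLakehouse", "data/sales.csv"))
```

The resulting URI would then be passed to `spark.read.format("csv").load(...)` on a cluster that is authenticated against OneLake.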

Hope this helps. Please do let us know if you have any further questions.

Hi @Shawn_Eary ,
Following up to check whether your issue has been resolved. Please let me know if you have any further questions.
