Add ability to import normal python files / modules to notebooks

Please add the ability to import normal python files (called modules in python lingo) into a notebook, like this:


import my_normal_python_file

my_normal_python_file.my_reusable_function()


This is standard python functionality, but it's currently impossible when working with notebooks in Fabric. Since we can only create notebooks, not plain python files, our only options are:


%run another_notebook

or

notebookutils.notebook.run("another_notebook")


Neither of these alternatives puts the functions from "another_notebook" into a proper namespace the way the regular python import statement does.
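
To make the namespace difference concrete, a minimal sketch (assuming "another_notebook" defines the same my_reusable_function() as the module above):

# %run executes the other notebook in the current session, so its
# functions land directly in the global scope:
%run another_notebook
my_reusable_function()  # bare name, easy to shadow or collide with

# A real import keeps the function behind the module's name:
import my_normal_python_file
my_normal_python_file.my_reusable_function()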

In Databricks you can create regular python files and the import statement works as you'd expect. It would be nice to have the same functionality in Fabric.


Please be aware of the distinction between packages and modules. There is already the capability to install and import packages, but python packages are cumbersome to work with during heavy development: every change typically means rebuilding and reinstalling the package. The import statement combined with regular python files (aka modules) is just right for a lot of use cases.
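
To illustrate why plain modules suit a tight development loop (reusing the hypothetical module name from above): an edit to a .py file takes effect after a simple reload, with no build or install step.

import importlib
import my_normal_python_file

# After editing my_normal_python_file.py, pick up the change without
# restarting the kernel:
importlib.reload(my_normal_python_file)

# With an installed package, the loop is typically:
#   edit -> build wheel -> pip install --force-reinstall -> restart kernel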

Status: New
Comments
SJCuthbertson
Advocate I
Crucially, for this to be useful, those normal python files HAVE to be part of the Fabric workspace git integration, just like notebooks are, and they HAVE to be supported by Fabric deployment pipelines.
JohanRex
Regular Visitor
Good clarification. Just look at what Databricks have done and implement the same. Plus deployment pipeline support, of course.
effdfdfd
Regular Visitor
This is critical!
skell
New Member
This is indeed critical. Without this feature, reusing code in Fabric is very cumbersome and, crucially, limited to PySpark notebooks, which consume a lot of capacity! Most of us need to be able to develop using pure Python kernels and don't actually need Spark!
ex_kjetilh
Regular Visitor
This is critical!
SimonHommel
New Member
Currently we have to resort to bad workarounds, e.g. creating a python file in a lakehouse and importing it into the notebook from there. However, this is not git-integrated, and developing code in a lakehouse is unnatural. It would be great if we could create .py files in a workspace at the same level where we create notebooks. For me this is also a critical feature. Databricks has had this for more than two years, and in JupyterLab you can also work with ordinary text files. A minimal sketch of the workaround follows.
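
For reference, a minimal sketch of that workaround, assuming the notebook has a default lakehouse mounted at /lakehouse/default (the folder and module names here are hypothetical):

import sys

# Make the lakehouse Files folder importable; my_module.py was
# uploaded there beforehand.
sys.path.append("/lakehouse/default/Files/modules")

import my_module

my_module.my_reusable_function()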