Feature Request: Python notebooks should support custom Environments, similar to PySpark notebooks.

Fabric currently supports custom Environments for PySpark notebooks, but Python notebooks (preview) cannot use them. In Python notebooks, the only available kernels are Python 3.10 and 3.11, and the only way to install packages is via %pip install, which is session‑scoped, not reproducible, and not aligned with Fabric’s managed environment model.

This creates a major gap for teams who want consistent dependency management across both PySpark and Python workflows.

Request

Please add support for:

  • Custom Python environments (similar to virtual environments)

  • Attaching those environments to Python notebooks

  • Managing Python‑only dependencies through the Environment UI

This would bring Python notebooks to parity with PySpark notebooks and enable reliable, repeatable development without %pip workarounds.

Example

In PySpark notebooks with an Environment, this works:

from faker import Faker

fake = Faker()
fake.name()

In Python notebooks, the same code works only after running:

 
%pip install faker

…and must be reinstalled every session.
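Until environment support exists, the closest workaround is a first cell that pins versions and installs only what is missing. This is just a sketch, not a Fabric API: the helper names and the pinned faker version below are illustrative.

```python
import importlib.util
import subprocess
import sys

def module_name(requirement):
    """Derive an importable module name from a pip requirement string.
    Hypothetical helper; some distributions don't follow this naming rule."""
    return requirement.split("==")[0].split(">=")[0].strip().replace("-", "_")

def ensure_installed(requirements):
    """Install any pinned requirement whose module is not yet importable
    in the current kernel, so re-running the cell is a cheap no-op."""
    for req in requirements:
        if importlib.util.find_spec(module_name(req)) is None:
            subprocess.check_call([sys.executable, "-m", "pip", "install", req])

# Pinned, notebook-scoped dependencies; the version is illustrative.
REQUIREMENTS = ["faker==25.0.0"]
# ensure_installed(REQUIREMENTS)  # run once at the top of each session
```

This at least makes the per-session install explicit and version-pinned, but it still has to run every session, which is exactly the gap a custom Environment would close.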

Enabling custom environments for Python notebooks would solve this cleanly and improve the overall Fabric development experience.

 
Status: New
Comments
deborshi_nag
Memorable Member
I agree, Environments in Fabric should work for both PySpark and Python notebooks. The concept is very similar to maintaining a virtual environment for a Python project, so it's hard to see why it works for PySpark notebooks and not for Python ones.