Fabric currently supports custom Environments for PySpark notebooks, but Python notebooks (preview) cannot use them. In Python notebooks, the only available kernels are Python 3.10 and 3.11, and the only way to install packages is via %pip install, which is session‑scoped, not reproducible, and not aligned with Fabric’s managed environment model.
This creates a major gap for teams who want consistent dependency management across both PySpark and Python workflows.
Please add support for:
Custom Python environments (similar to virtual environments)
Attaching those environments to Python notebooks
Managing Python‑only dependencies through the Environment UI
This would bring Python notebooks to parity with PySpark notebooks and enable reliable, repeatable development without %pip workarounds.
In PySpark notebooks with an Environment, this works:
from faker import Faker
fake = Faker()
fake.name()
In Python notebooks, the same code only works after running:
%pip install faker
…and must be reinstalled every session.
Enabling custom environments for Python notebooks would solve this cleanly and improve the overall Fabric development experience.