Fabric currently supports custom Environments for PySpark notebooks, but Python notebooks (preview) cannot use them. In Python notebooks, the only available kernels are Python 3.10 and 3.11, and the only way to install packages is via %pip install, which is session‑scoped, not reproducible, and not aligned with Fabric’s managed environment model.
This creates a major gap for teams who want consistent dependency management across both PySpark and Python workflows.
Please add support for:
Custom Python environments (similar to virtual environments)
Attaching those environments to Python notebooks
Managing Python‑only dependencies through the Environment UI
This would bring Python notebooks to parity with PySpark notebooks and enable reliable, repeatable development without %pip workarounds.
In PySpark notebooks with an Environment, this works:
from faker import Faker
fake = Faker()
fake.name()
In Python notebooks, it only works after:
%pip install faker
…and must be reinstalled every session.
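In the meantime, the common workaround is to make the per-session install idempotent so a restarted kernel recovers quickly. The sketch below is an assumption about typical practice, not a Fabric feature: `ensure_package` is a hypothetical helper that only shells out to pip when the module is actually missing, mirroring what %pip install does (the install is session-scoped and disappears when the kernel stops).

```python
import importlib.util
import subprocess
import sys

def ensure_package(module_name, pip_name=None):
    """Install a package into the current session only if it is missing.

    Session-scoped, like %pip install in a Fabric Python notebook:
    the package is gone once the kernel restarts, which is exactly
    the reproducibility gap a custom Environment would close.
    """
    if importlib.util.find_spec(module_name) is None:
        subprocess.check_call(
            [sys.executable, "-m", "pip", "install", pip_name or module_name]
        )

# Already present in the standard library, so no install is triggered:
ensure_package("json")

# In a Fabric Python notebook you would instead write, e.g.:
#   ensure_package("faker")
#   from faker import Faker
#   fake = Faker()
#   fake.name()
```

This keeps notebooks re-runnable, but it is still a workaround: every session pays the install cost once, and the dependency list lives in code rather than in a managed Environment.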
Enabling custom environments for Python notebooks would solve this cleanly and improve the overall Fabric development experience.