Fabric currently supports custom Environments for PySpark notebooks, but Python notebooks (preview) cannot use them. In Python notebooks, the only available kernels are Python 3.10 and 3.11, and the only way to install packages is `%pip install`, which is session-scoped, not reproducible, and not aligned with Fabric's managed environment model. This creates a major gap for teams that want consistent dependency management across both PySpark and Python workflows.

**Request**

Please add support for:

- Custom Python environments (similar to virtual environments)
- Attaching those environments to Python notebooks
- Managing Python-only dependencies through the Environment UI

This would bring Python notebooks to parity with PySpark notebooks and enable reliable, repeatable development without `%pip` workarounds.

**Example**

In a PySpark notebook with an Environment attached, this works out of the box:

```python
from faker import Faker

fake = Faker()
fake.name()
```

In a Python notebook, it only works after running:

```python
%pip install faker
```

…and the package must be reinstalled every session. Enabling custom environments for Python notebooks would solve this cleanly and improve the overall Fabric development experience.