
Dominoes2
Frequent Visitor

Python Notebooks (not Pyspark) having default package issues

I have been working on a notebook today. I went for lunch, came back, and some pretty big changes seem to have occurred (around 13:00 UTC+10):

 

- Since then connecting to a session takes 60 seconds instead of 2 seconds in the morning

- display() does not seem to work; it fails silently

- Some packages that used to be in the environment, such as deltalake, now need to be installed with pip to be accessible

 

Is this all expected?

4 Replies
Anonymous
Not applicable

Hi @Dominoes2,

I'd suggest you check whether the default environment and Spark pool settings have changed. In addition, you could check whether the Fabric capacity SKU was reduced to a smaller level.

Create, configure, and use an environment in Fabric - Microsoft Fabric | Microsoft Learn

What is Power BI Premium? - Power BI | Microsoft Learn

Regards,

Xiaoxin Sheng

Hi @Anonymous 

 

From what I can tell, vanilla Python notebooks do not support environments; only PySpark notebooks do.

Use Python experience on Notebook - Microsoft Fabric | Microsoft Learn

 

The docs do say that for packages to work you may need to restart the kernel. This works when running the notebook manually, but how is it expected to work when running on a schedule? Is there a programmatic way to restart the kernel?

(screenshot attached)

 

Anonymous
Not applicable

Hi @Dominoes2,

Perhaps you can try importing the os library in the notebook and using its methods to restart the kernel:

jupyter - Restart ipython Kernel with a command from a cell - Stack Overflow
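For example, a minimal sketch along the lines of that Stack Overflow answer, assuming the Fabric Python notebook runs a Jupyter-style kernel that the runtime restarts automatically when the kernel process exits (worth verifying in your tenant before relying on it in a scheduled run):

```python
import os


def restart_kernel() -> None:
    """Hard-exit the current kernel process.

    In Jupyter-style notebooks, the runtime typically starts a fresh
    kernel automatically after the process exits, so packages installed
    earlier in the session become importable in subsequent cells.
    """
    os._exit(0)  # terminates the kernel process immediately, skipping cleanup


# Typical usage in a notebook cell (do not call mid-script, since
# everything after the call in that cell will never run):
#   %pip install deltalake
#   restart_kernel()
```

Note that `os._exit` bypasses normal interpreter shutdown (no `atexit` handlers, no flushing), which is exactly why it works for forcing a kernel restart, but it means any unsaved in-memory state is lost.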

Regards,

Xiaoxin Sheng

Hello, and thank you for opening this conversation. I am facing the same situation. I want to create a Python environment with some libraries; I have a simple piece of code to run, and I don't need Spark, just a plain Python environment. Do you know of another way to create an environment with the libraries we need that are not in the default Python environment?
