
Jeanxyz
Impactful Individual

Speed up Spark computation

I have used Fabric notebooks before on an SKU2 capacity and performance seemed fine. However, I have now purchased a Fabric SKU2 capacity with my personal account, and the notebook is very slow. For example, I tried to install the pdfplumber package in a new environment, and it took more than 10 minutes to publish the package.

I tried to optimize the Spark settings in the workspace settings, but it won't let me change the autoscale setting. I also tried to change the Spark settings in my custom environment, but that doesn't work either.

 

[Attachments: Screenshot 2025-05-19 122303.png, Screenshot 2025-05-19 121539.png]

 

1 ACCEPTED SOLUTION
v-pagayam-msft
Community Support

Hi @Jeanxyz ,
Thank you for reaching out to us on the Microsoft Fabric Community Forum!

Even though you are using SKU2, performance can vary between organizational and personal environments due to differences in how Spark resources are provisioned. Slow installs of packages like pdfplumber are common if the environment is still initializing or the pool is small. To improve this, consider reusing the environment after the first install and increasing the number of nodes if your SKU allows.
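If you need the package right away, a quicker workaround is a session-scoped install from the notebook itself instead of waiting for the environment to publish. A minimal sketch, assuming the pdfplumber package from your example and a placeholder PDF path in the default lakehouse:

# Run in its own notebook cell: installs for the current Spark session only,
# with no environment publish needed; it does not persist after the session ends.
%pip install pdfplumber

# In a later cell, use the package as usual (the file path is only an example).
import pdfplumber

with pdfplumber.open("/lakehouse/default/Files/sample.pdf") as pdf:
    print(pdf.pages[0].extract_text())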
Regarding the autoscale settings being greyed out in both the workspace and environment, this happens if the pool is already in use or was created without autoscale enabled. To resolve this, create a new Spark pool with autoscale configured during setup, then assign it to your environment. 
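Once the new pool is assigned, you can check from a notebook cell what the session actually received. This is plain Spark configuration inspection using the built-in spark session object; keys that are not set simply print the fallback value:

# Inspect the live session to verify the pool / autoscale changes took effect.
print(spark.conf.get("spark.executor.cores", "not set"))    # cores per executor
print(spark.conf.get("spark.executor.memory", "not set"))   # memory per executor
print(spark.conf.get("spark.dynamicAllocation.enabled", "not set"))  # executor-level scaling, related to but not the same as pool autoscale
print(spark.sparkContext.defaultParallelism)                # default number of task slots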
Refer to these links:
Stock Keeping Unit (SKU) considerations - Microsoft Fabric | Microsoft Learn
Understand Autoscale compute for Spark detail page - Microsoft Fabric | Microsoft Learn


I hope this resolves your query. If so, give us kudos and consider accepting it as the solution.

Regards,
Pallavi.


2 REPLIES
Jeanxyz
Impactful Individual

Thanks, I have activated autoscale and the problem is solved. However, just for your information, Fabric is still very slow. I have published the same Power BI report to two workspaces: a Fabric workspace on SKU2 and a Pro workspace. It takes 30 minutes to refresh the dataset in the Fabric workspace, but only 12 minutes in the Pro workspace. So my conclusion is: if the semantic model is in import mode, using a Fabric data source will slow down the import unless you have a large Fabric capacity.

