
venkatganesh_96
Regular Visitor

Unable to Publish Libraries in Spark Environment – PbiApiError

I am experiencing an issue while publishing libraries in my Spark environment. I have created a Spark environment and attempted to install and publish a library. However, when I try to save and publish, I encounter the following error:

    Failed to save environment: PbiApiError: Failed to save environment
    Failed to save settings: Failed to fetch [ParentActivityId:001f9457-9623-4549-b386-e6c3546124e0]
    Failed to save library

2 ACCEPTED SOLUTIONS
v-ssriganesh
Community Support

Hi @venkatganesh_96,

Thank you for reaching out to the Microsoft Fabric community! I appreciate the valuable insights shared by @nilendraFabric.

To add to his response, the PbiApiError when publishing libraries in the Spark environment can often be linked to:

  • Some libraries may not be fully compatible with certain Spark runtime versions. As mentioned, older versions of great-expectations and similar libraries may cause issues. Trying a pre-release version or checking for updates might help.
  • Click "View log" and "View details" in the publishing interface to get more insights into what might be causing the failure. If there are dependency-related issues, you may need to adjust your library versions or configurations.
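As a quick way to act on the first point, you can print the versions that actually resolve in a notebook session and compare them against the runtime's compatibility notes. A minimal sketch, assuming a Python notebook; the package names are only examples, substitute the libraries from your own environment:

```python
from importlib import metadata

def installed_version(pkg: str):
    """Return the installed version of pkg, or None if it is not installed."""
    try:
        return metadata.version(pkg)
    except metadata.PackageNotFoundError:
        return None

# Package names below are illustrative; replace them with the libraries
# that fail to publish in your environment.
for pkg in ("great-expectations", "pandas", "numpy"):
    print(pkg, "->", installed_version(pkg) or "not installed")
```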

If this helps, please accept it as a solution and drop a "Kudos" so other members can find it more easily.
Thank you.

View solution in original post

Hi @venkatganesh_96,
Thanks for trying the suggested steps and getting back to us! Since switching to Spark Runtime 1.2 didn’t resolve the issue, let’s try a few additional troubleshooting steps:

  • Instead of publishing, try installing the libraries first and verify if any errors appear during installation. You can do this by selecting the library, clicking Install, and monitoring the logs.
  • Go to your Spark environment, click on Stop, wait a few moments, and then click Start again before trying to publish. Sometimes, session caching or stale configurations can cause issues.
  • If a specific library is causing the failure, try removing it and publishing again. Are you using any specific Python or R packages? If so, please share the list of installed libraries.
  • Please ensure you have the necessary permissions to modify and publish environments. If this is a shared workspace, confirm that there are no restrictions on publishing libraries.

Additionally, could you share the full error log from the "View log" section? That will help us diagnose the issue more precisely.
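To act on the first and third bullets above, you can also test-import the libraries one at a time in a notebook cell to isolate the one that fails. A small sketch, assuming Python packages; the module names are placeholders for your own list:

```python
import importlib

def find_failing_imports(modules):
    """Try importing each module; collect the error message for any that fail."""
    failures = {}
    for mod in modules:
        try:
            importlib.import_module(mod)
        except Exception as exc:
            failures[mod] = repr(exc)
    return failures

# Replace with the import names of the libraries in your environment.
print(find_failing_imports(["json", "great_expectations"]))
```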

I trust this information proves useful. If it does, kindly Accept it as a solution and give it a 'Kudos' to help others locate it easily.
Thank you.

View solution in original post

7 REPLIES
v-ssriganesh
Community Support

Hi @venkatganesh_96,
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.

v-ssriganesh
Community Support

Hi @venkatganesh_96,

May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution; this will help other community members with similar problems resolve them faster.

Thank you.

venkatganesh_96
Regular Visitor

I tried changing the Spark runtime to 1.2, but when I click Save I get the same PbiApiError.


Hi @venkatganesh_96,
I hope this information is helpful. Please let me know if you have any further questions or if you'd like to discuss this further. If this answers your question, please Accept it as a solution and give it a 'Kudos' so others can find it easily.
Thank you.


nilendraFabric
Super User

Hello @venkatganesh_96,

I have faced a similar issue; the link is attached below.

If the libraries or configurations in the environment are incompatible with the selected Spark runtime version, publishing will fail. For example, older versions of libraries like `great-expectations` have been reported to cause publishing errors unless pre-release versions are used.

Known issues with specific runtime versions (e.g., Fabric Runtime 1.3) can cause library installation or publishing failures due to backend updates or versioning conflicts:

 

https://community.fabric.microsoft.com/t5/Data-Engineering/Environment-with-Runtime-1-3-fails-when-u...

Review the error logs for more details on the failure by clicking "View log" and "View details" in the publishing interface. This can help identify specific issues with dependencies or configurations.
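On the pre-release point: pip only considers pre-release builds when asked (for example, `pip install --pre great-expectations`), so it is worth confirming which kind of build actually resolved before publishing. A rough heuristic check, not a full PEP 440 parser:

```python
def is_prerelease(version: str) -> bool:
    """Rough PEP 440 check: pre-release/dev versions carry a/b/rc/dev tags.

    This is a heuristic for quick inspection, not a full version parser.
    """
    return any(tag in version for tag in ("a", "b", "rc", "dev"))

print(is_prerelease("0.18.8"))     # -> False (stable release)
print(is_prerelease("0.18.0rc1"))  # -> True (release candidate)
```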

 

If this helps, please accept the solution and give kudos.

Related thread: Unable to upload R library (tar.gz format) in Fabric Environment
