Environment for Python packages to be used by different PySpark scripts in different Workspaces

Let us say I have created a new Environment named Environment_A containing all the Python packages needed for a PySpark script running in Workspace_A. Can Environment_A also be used by another PySpark script running in Workspace_B? I find that it sometimes works, and other times it gets disconnected instantly.


The question, then, is: how can I create a new Environment that can be used across different workspaces for different PySpark scripts? Creating a new environment with all the required Python packages takes time, and we do not want to duplicate the same environment across all the different workspaces.


What is the best solution then?


Every time I need those identical libraries in another workspace, I have to create another environment in that workspace, and creating a new environment takes a long time. Can Microsoft look into providing a better solution for this?

Status: New