Let's say I have created a new Environment named Environment_A containing all the Python packages a PySpark script needs inside Workspace_A. Can Environment_A also be used by another PySpark script running in Workspace_B? I find that sometimes it works, and other times it gets disconnected instantly.
The question, then, is: how can I create a single Environment that can be used across different workspaces for different PySpark scripts? Creating a new environment with all the required Python packages takes time, and we don't want to duplicate the same environment across every workspace.
What is the best solution then?
Every time I need those identical libraries in another workspace, I have to create another environment in that workspace, and creating a new environment takes a long time. Can Microsoft look into providing a better solution for this?
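Until environments can be shared directly across workspaces, one workaround is to script the environment creation so the same definition can be replayed in each workspace instead of being rebuilt by hand. This is only a minimal sketch against the Fabric REST API's environment endpoint; the exact path, payload shape, and the `FABRIC_TOKEN` placeholder are assumptions, so verify them against the current REST API reference before relying on it.

```python
import json
import os
import urllib.request

# Base URL of the Fabric REST API (assumed from the public documentation).
FABRIC_API = "https://api.fabric.microsoft.com/v1"


def environments_url(workspace_id: str) -> str:
    """Build the environments endpoint for a given workspace."""
    return f"{FABRIC_API}/workspaces/{workspace_id}/environments"


def create_environment(workspace_id: str, display_name: str, token: str) -> dict:
    """POST a new environment into the target workspace.

    Calling this once per workspace replays the same environment
    definition everywhere, instead of recreating it manually.
    """
    req = urllib.request.Request(
        environments_url(workspace_id),
        data=json.dumps({"displayName": display_name}).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read().decode("utf-8"))


if __name__ == "__main__":
    # Hypothetical placeholders: supply real workspace IDs and an AAD token.
    token = os.environ.get("FABRIC_TOKEN", "")
    for ws_id in ("<Workspace_A-id>", "<Workspace_B-id>"):
        print(environments_url(ws_id))
```

After creation you would still attach the environment to each notebook (and install the package list into it), but at least the definition lives in one script rather than being duplicated by hand per workspace.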