Hello to the community,
I've just started a project in the Fabric environment.
Several notebooks have been created directly in Fabric through the web interface.
I have used the classic PySpark (Python) environment in these notebooks.
I'm now trying to open a notebook in my local VS Code.
To do this, I followed the documentation provided by Microsoft Fabric. I already had conda environments on my machine, so I just reinstalled OpenJDK and installed the Synapse VS Code extension.
Once this was done, I clicked on the button to open the notebook directly in my local VS Code.
The following steps completed without error: connecting to my workspace, downloading the notebook, and creating the fabric-synapse-runtime-1-1 and fabric-synapse-runtime-1-2 conda environments.
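Just to be sure the local setup was usable, I ran a small sanity check from the kernel. This is only a sketch based on my own assumptions about how Java is located; the Synapse VS Code extension may configure it differently:

import os
import shutil

# Quick look at the environment the local kernel is running in.
# Relying on JAVA_HOME is an assumption; the extension may find Java another way.
print("JAVA_HOME   :", os.environ.get("JAVA_HOME", "<not set>"))
print("java on PATH:", shutil.which("java"))
print("Active env  :", os.environ.get("CONDA_DEFAULT_ENV", "<unknown>"))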
I attach the correct kernel to the notebook and run the first cell.
Out of habit I always check the library versions, and I notice a big difference between them.
I still try to run the cell where I read a Lakehouse table into my variables, but I get an error.
The error below indicates a version mismatch between my machine and the server.
I can't find the solution; I've tried to install the version of Spark that Fabric uses, without success.
On the right, a screenshot from my local VS Code; on the left, a screenshot from the notebook integrated into Fabric:
Error encountered:
spark.read.table("LakeHouse_NAME.tbl_TABLE_NAME")
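For reference, here is the quick check I run on both sides (the local VS Code kernel and the Fabric notebook) to see which versions are actually in play; the spark session object is only available once a session is attached:

import sys
import pyspark

# Print the interpreter and PySpark package versions of the current kernel.
print("Python :", sys.version.split()[0])
print("PySpark:", pyspark.__version__)

# With an active session (the Fabric notebook provides `spark` automatically),
# the runtime version of the cluster itself can be read as well:
# print("Spark  :", spark.version)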
Thank you for your advice and feedback,
I won't hesitate to post the solution here if it comes from an external source.
Hi @VIvMouret ,
This typically happens when the Spark versions used in the local and Fabric environments are different.
If the Spark versions of the two environments differ, you may need to upgrade or downgrade your local Spark version to match the Fabric environment.
Here is a user who encountered a similar issue; you can refer to their solution:
Working with remote spark. · Issue #56 · USCDataScience/sparkler · GitHub
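As a rough sketch of that check, you can pin the expected Spark line and fail fast if the local PySpark does not match it. The exact version is an assumption on my part (at the time of writing, Fabric Runtime 1.1 is based on Spark 3.3.x and Runtime 1.2 on Spark 3.4.x), so please confirm it against the runtime documentation for your workspace:

import pyspark

# Assumed target line; adjust to the Spark version your Fabric runtime reports.
EXPECTED_SPARK = "3.4"

if not pyspark.__version__.startswith(EXPECTED_SPARK):
    raise RuntimeError(
        f"Local PySpark {pyspark.__version__} does not match the expected "
        f"Fabric runtime line {EXPECTED_SPARK}.x; reinstall it inside the "
        f"fabric-synapse-runtime conda environment, e.g. "
        f"pip install 'pyspark=={EXPECTED_SPARK}.*'"
    )
print(f"PySpark {pyspark.__version__} matches the expected {EXPECTED_SPARK}.x line.")

If the versions do not match, recreating the fabric-synapse-runtime environment from the extension, or pinning pyspark to the matching release inside it, should bring the two sides in line.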
If you have any other questions please feel free to contact me.
Best Regards,
Yang
Community Support Team
If any post helps, please consider accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!
Hi @VIvMouret ,
This is just a follow-up to ask whether the problem has been solved.
If so, could you accept the correct answer as the solution, or share your solution to help other members find it faster?
Thank you very much for your cooperation!
Best Regards,
Yang
Community Support Team
If any post helps, please consider accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!