Hi,
I have numerous notebooks in which I use Python, and specifically the pandas library, for data transformations. Today I started to receive errors across multiple notebooks in different workspaces on the line of code that imports the pandas module:

from fsspec_wrapper.trident.core import OneLakeFileSystem -> ModuleNotFoundError: No module named 'fsspec_wrapper.trident.core'

The error occurs only when the notebook environment is a custom environment rather than "Workspace Default". These notebooks have run for months with no issues, no manual changes were made to the custom environment, and they just started throwing this error today. I tried adding the newest fsspec library (the 2/1/2025 release) from PyPI as a 'public library' in my custom environment, and importing fsspec before pandas in the notebook, but I still got the same error.

The only fix I've found is switching every affected notebook from the custom environment to 'Workspace Default'. But sometimes the packages I need in a notebook aren't included in the default workspace libraries, so I have to use a custom environment together with pandas. How can I solve this issue? Thanks so much!
This issue seems to stem from a recent change in how Fabric handles internal dependencies—particularly around the fsspec_wrapper.trident.core module, which is now required when importing pandas in custom environments.
The error:
ModuleNotFoundError: No module named 'fsspec_wrapper.trident.core'
typically occurs when using custom environments in Fabric notebooks. It appears that the pandas import is now indirectly dependent on internal Fabric components that are not bundled in custom environments by default.
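To see which link in the import chain is actually broken, a small diagnostic cell can check each module involved. This is a hedged sketch using only the standard library, not a Fabric-specific API; fsspec_wrapper is a Fabric-internal package, so outside Fabric it will always be reported as missing, which is expected.

```python
import importlib.util

def missing_modules(names):
    """Return the subset of top-level module names that cannot be located."""
    return [n for n in names if importlib.util.find_spec(n) is None]

# 'fsspec_wrapper' is the internal package named in the error message.
print(missing_modules(["pandas", "fsspec", "fsspec_wrapper"]))
```

If only fsspec_wrapper appears in the output, the problem is the internal Fabric dependency rather than your own packages.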
Switch to the Workspace Default Environment
This resolves the issue for many users, but it’s not always viable if your custom environment includes packages not available in the default one.
Force Re-Publish the Custom Environment
As suggested by Microsoft staff, make any change to your custom environment (e.g., add a dummy package like pytest from PyPI), save, and redeploy. This forces a refresh of the environment and often resolves the issue [1].
Manually Add fsspec from Public Libraries
Some users had success by adding the latest version of fsspec (e.g., from February 2025) via the public library tab. However, this alone may not always fix the issue if the internal wrapper is still missing.
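After adding fsspec from the public library tab, it's worth confirming which version the environment actually resolves. A minimal check, assuming nothing beyond the standard library:

```python
from importlib.metadata import version, PackageNotFoundError

def pkg_version(dist_name):
    """Return the installed version of a distribution, or None if absent."""
    try:
        return version(dist_name)
    except PackageNotFoundError:
        return None

print("fsspec:", pkg_version("fsspec"))
```

If this still reports the older built-in version after publishing, the environment likely did not refresh and a forced re-publish is needed.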
Upgrade the Runtime Version
Upgrading your custom environment’s runtime (e.g., from 1.2 to 1.3) may help, as it updates the Spark version and potentially resolves dependency mismatches.
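One quick way to confirm which runtime a session actually picked up is to check the interpreter version. Per Microsoft's published runtime matrix (verify against the current docs, as this may change): Runtime 1.2 ships Python 3.10 / Spark 3.4, and Runtime 1.3 ships Python 3.11 / Spark 3.5.

```python
import sys

# The Python minor version indicates which Fabric runtime the session is on
# (3.10 -> Runtime 1.2, 3.11 -> Runtime 1.3, per the runtime docs).
major, minor = sys.version_info[:2]
print(f"Python {major}.{minor}")
```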
This seems to be a platform-level dependency issue, and while workarounds exist, it may recur with future updates. If you're running production workloads, be prepared to reapply one of the workarounds above quickly when it does.
Using the latest runtime could solve this issue if installing the fsspec library alone didn't resolve it.
I posted the mitigation below
I'm also starting to experience this same issue as of last night 3/13/25.
Having "fixed" the issue this morning across multiple workspaces, we've just had it re-occur again a few hours later across all the same workspaces 😫.
Our resolution path (presuming it doesn't occur again) was:
1. Upgrade Runtime of our custom environment from 1.2 to 1.3, which upgrades Spark from 3.4 to 3.5. That seemed to work for about a day or so, but then it broke again. Switching to the Default Environment didn't help, that was broken too.
2. Although fsspec was already in the Built-in libraries, we switched to the Public libraries and added fsspec from PyPI. This brought in a newer version of fsspec than the one in the Built-in libraries, and that has been working for about 3-4 days so far with no problem.
It's a really frustrating issue and I suspect that a lot more people are going through it. The Microsoft Learn article that was posted earlier was interesting, but to me this was more of a total break on Microsoft's part and should have been treated as such.
Yeah... the issue continues to happen here...
Having the same issue here, specifically when notebook tries to import pandas package.
The workaround is to force the environment to republish. This will fix the problem until the next time the internal core libraries in Fabric change. To trigger a publish, _any_ change must be made to the environment; it does not matter what. I tested by adding the public "pytest" package from PyPI, saved, and deployed the environment, and the problem was gone.
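Until the root cause is fixed platform-side, a guard cell at the top of a notebook can turn the cryptic ModuleNotFoundError into an actionable message, so a scheduled pipeline fails fast instead of half-running. This is a hedged sketch, not a Fabric API; it uses a stdlib module as a stand-in so the snippet is self-contained.

```python
import importlib

def safe_import(name):
    """Import a module, converting a missing-module failure into a clear error."""
    try:
        return importlib.import_module(name)
    except ModuleNotFoundError as exc:
        raise RuntimeError(
            f"import of {name!r} failed -- possibly the stale-environment issue; "
            "force a republish of the custom environment (any change, e.g. "
            "adding pytest) and rerun."
        ) from exc

json_mod = safe_import("json")  # stdlib stand-in; in Fabric you'd guard "pandas"
print(json_mod.__name__)
```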
Are you on Runtime 1.2 or 1.3?
We've already implemented the workaround, but it remains a workaround. My concern is whether this kind of scenario is likely to be a recurring thing, and whether there's anything we could or should be doing to mitigate against it in a production environment. At least one person above has indicated the issue has re-occurred for them, and required fixing with a workaround for a second time this week.
This issue has hit us overnight, impacting multiple business areas and scheduled pipelines.
Is there some way to mitigate against platform side changes like this on the customer side?
We are also having this issue. Trying the deliberate install now.
Issue just came back for me after I tried to deliberately install the package from the public repo. Upgrading the runtime now to see if that fixes it more permanently. This is breaking some of my most important pipelines and requiring that I wait for what is already an unacceptably long environment publish time before re-running them.
The issue has not returned for me since also upgrading the runtime
Annoyingly, this issue has returned for me this morning in North Europe. Is this occurring for anybody else? The notebook in our dev workspace was OK, while the same notebook in test and production was failing. The fix is the same as before: force a republish of the custom environment by making a change, e.g. adding/removing an unnecessary library like "pytest" or changing the runtime version.
We had the same issue, it got solved after installing fsspec module from public libraries in our custom environment
Microsoft FTE here. We're seeing the same issue. It has messed up all of our reporting workspaces, again.
Support, feel free to ping me internally on Teams to see if we can find more details.
Hello @tarainfotech ,
We noticed we haven't received a response from you yet, so we wanted to follow up and ensure the solution we provided addressed your issue. If you require any further assistance or have additional questions, please let us know.
Your feedback is valuable to us, and we look forward to hearing from you soon.
The issue can be fixed by removing the libraries from the environment and adding them again.
In fact, any change to the libraries in the environment will likely fix the issue, because it brings the library cache "up to date" with the most recent Python environment built into the VHD.
This is why adding the fsspec library "fixes" the issue, even though adding that library shouldn't be strictly necessary.
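To see the cache-vs-VHD distinction concretely, you can print where the interpreter resolves third-party packages from. This is a generic stdlib sketch, not a Fabric API: when an environment's library cache is stale, the path here can point at an older snapshot rather than the freshly built image.

```python
import sysconfig

# Location third-party packages are resolved from in this interpreter.
purelib = sysconfig.get_paths()["purelib"]
print(purelib)
```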