I had been struggling for about 10 minutes to read a file from the lakehouse.
I mounted like so:
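Something like this (the workspace and lakehouse names below are placeholders, not my real ones):

```python
# Sketch of the mount call; workspace/lakehouse names are illustrative.
from notebookutils import mssparkutils

# Mount the lakehouse "Files" area under the mount point /LakehouseFiles.
mssparkutils.fs.mount(
    "abfss://<workspaceName>@onelake.dfs.fabric.microsoft.com/<LakehouseName>/Files",
    "/LakehouseFiles",
)

# getMountPath resolves the mount point to the session-local /synfs/... path.
local_root = mssparkutils.fs.getMountPath("/LakehouseFiles")
print(local_root)  # e.g. /synfs/notebook/<sessionId>/LakehouseFiles
```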
... then I added new directories and a new file into the lakehouse "Files".
The file was called "Partition.xml".
Then I tried to read the Partition.xml file (below) and it kept failing:
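The read was plain Python through the mount path (reconstructed here; the directory names match the error below):

```python
import os
from notebookutils import mssparkutils

# Build the session-local path through the /synfs mount and read with plain Python.
xml_path = os.path.join(
    mssparkutils.fs.getMountPath("/LakehouseFiles"),
    "InventoryManagement", "InventoryAgings", "Partition.xml",
)
with open(xml_path, "r") as f:  # kept raising FileNotFoundError for minutes
    content = f.read()
```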
FileNotFoundError: [Errno 2] No such file or directory: '/synfs/notebook/xxxxxxx-cf8e-45a8-9097-15f0db4f3883/LakehouseFiles/InventoryManagement/InventoryAgings/Partition.xml'
Finally, after about ten minutes, it worked. What a nightmare... Can someone tell me if this is expected behavior for the "synfs" mounts? This crappy behavior sounds like par for the course, based on my other experiences on this platform, but ten minutes is way too long. Is there any way to force the mssparkutils.fs mounts to behave better? E.g., should I sync on demand?
Hi @dbeavon3,
You’re not imagining it - the /synfs path you get back from mssparkutils.fs.getMountPath(...) is an eventually-consistent projection of OneLake into the notebook’s file system. When you add or rename files outside the running Spark session (e.g., via the Lakehouse UI, another job, or upload), it can take a while before the mount’s directory listing catches up. That’s why your open("/synfs/notebook/.../Partition.xml") failed for several minutes and then “magically” started working.
Two workarounds:
1. Refresh the mount metadata before reading:

```python
from notebookutils import mssparkutils  # also preloaded in Fabric notebooks

mssparkutils.fs.refreshMounts()  # resync mount metadata so new files become visible
```

This forces the workspace/job mount metadata to resync so new files show up faster. See the Microsoft Spark Utilities docs and the refreshMounts notes in the Synapse mount API reference (same APIs).
2. Read directly from OneLake with fsspec, bypassing the mount:

```python
import fsspec

abfss_path = (
    "abfss://<workspaceId>@onelake.dfs.fabric.microsoft.com"
    "/<LakehouseName>/Files/InventoryManagement/InventoryAgings/Partition.xml"
)
with fsspec.open(abfss_path, "rb") as f:
    full_doc = f.read()
```

This avoids the /synfs cache entirely. Nice walkthrough: Using fsspec with OneLake.
If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.
Thanks. I wanted to use more conventional Python libraries (os). Sometimes that makes it easier for AI assistance and for migrating solutions to other platforms as needed (or for simply copy/pasting sample PySpark scripts into other standard Python solutions).
I like the tip about refreshMounts. I will do that more frequently. Only about 1% of my operations read directly from files; most of the file operations read and write blob data from abfss via PySpark.
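For anyone else landing here, this is roughly the pattern I'm moving to. A minimal sketch, assuming the illustrative mount point and file path from my post above (yours will differ):

```python
import os
import time
from notebookutils import mssparkutils

def read_mounted_file(relative_path, mount_point="/LakehouseFiles", retries=3, delay=5):
    """Read a file through the /synfs mount, refreshing mount metadata on a miss."""
    for attempt in range(retries):
        local_path = os.path.join(
            mssparkutils.fs.getMountPath(mount_point), relative_path
        )
        if os.path.exists(local_path):
            with open(local_path, "r") as f:
                return f.read()
        # File not visible yet: force the mount metadata to resync, then retry.
        mssparkutils.fs.refreshMounts()
        time.sleep(delay)
    raise FileNotFoundError(
        f"{relative_path} not visible under {mount_point} after {retries} attempts"
    )

content = read_mounted_file("InventoryManagement/InventoryAgings/Partition.xml")
```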