We consistently receive Parquet not found exceptions in our current setup.
Context:
We have a medallion architecture in Fabric: notebooks build silver-layer tables from bronze raw data in a D365 F&O lakehouse.
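For reference, a minimal sketch of the kind of rebuild our notebooks run each cycle (table and column names are placeholders):

```python
# Minimal sketch of one silver-layer build in a Fabric notebook.
# Table and column names are placeholders; `spark` is the session
# that Fabric notebooks provide automatically.
from pyspark.sql import functions as F

bronze_df = spark.read.table("bronze_custtable")   # raw D365 F&O data

silver_df = (
    bronze_df
    .filter(F.col("IsDelete") == 0)    # drop soft-deleted rows
    .dropDuplicates(["RECID"])         # keep one row per record id
)

# Each run drops and re-creates the silver table.
spark.sql("DROP TABLE IF EXISTS silver_custtable")
silver_df.write.format("delta").saveAsTable("silver_custtable")
```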
Error:
We receive these errors regularly and have to perform the fix steps below each time.
Fix:
Step 1: Go to source Semantic Model
Step 2: Click Edit Tables
Step 3: Confirm the schema
Step 4: Refresh report
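We have partially automated the refresh step from the notebook using Semantic Link (sempy); a sketch, with placeholder model and workspace names:

```python
# Sketch: trigger the semantic model refresh from the notebook right after
# the tables are rebuilt. Model and workspace names are placeholders.
import sempy.fabric as fabric

fabric.refresh_dataset(
    dataset="Silver Semantic Model",   # hypothetical semantic model name
    workspace="Finance Analytics",     # hypothetical workspace name
)
```

This only automates the refresh itself; the Edit Tables / schema confirmation still has to happen in the model editor whenever the schema changes, so it is not a real fix.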
Looking for:
A permanent solution. Is there a way to prevent the ParquetNotFound exception from occurring in the first place?
Hi @ucherukuri
Thanks for the reply from GilbertQ.
This behavior is by design: overwriting a Direct Lake table requires a manual refresh of the semantic model for it to keep working properly.
In more detail, tables are stored as Parquet files tracked by a Delta log, and deleting and re-creating a table deletes and re-creates that Delta log. An existing semantic model that is not refreshed afterwards still assumes the original Delta log (and the Parquet files it listed) exists, which causes the file not found error.
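A quick way to confirm this is what is happening: if the Delta log is being re-created, the table's history resets after each rebuild instead of accumulating versions (table name is a placeholder):

```python
# If the Delta log was deleted and re-created, the history starts over at
# version 0 after every rebuild instead of accumulating versions.
spark.sql("DESCRIBE HISTORY silver_custtable") \
    .select("version", "timestamp", "operation") \
    .show(truncate=False)
```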
For more details, see the related documentation:
Delta Lake logs in Warehouse - Microsoft Fabric | Microsoft Learn
In addition to the method you are using now, there is a workaround you can consider:
Switching from overwrite mode to append mode prevents the Delta log from being re-created, although note that not all data ingestion methods support append mode.
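As a sketch of that workaround (table name is a placeholder, and it assumes each run only produces rows that are not yet in the table):

```python
# Workaround sketch: append keeps the existing Delta log (and the Parquet
# files the semantic model has already framed) in place.
# Assumes `new_rows_df` holds only rows not yet in the table.
new_rows_df.write.format("delta").mode("append").saveAsTable("silver_custtable")
```

If your loads are not purely incremental, a Delta MERGE into the existing table has the same effect of keeping the log intact while handling updates.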
Best Regards,
Yulia Xu
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Hi @ucherukuri
Are you using Direct Lake for your semantic models, or are you using the Parquet connector to import the data?
My semantic models are built with Direct Lake mode tables.