We consistently receive "Parquet not found" exceptions in our current setup.
Context:
We have a medallion architecture in Fabric (notebooks build the silver-layer tables from bronze raw data in the D365 F&O lakehouse).
Error:
We regularly receive these errors and must repeatedly perform the following fix steps.
Fix:
Step 1: Go to the source semantic model
Step 2: Click Edit Tables
Step 3: Confirm the schema
Step 4: Refresh the report (a scripted alternative is sketched below)
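If these manual steps become routine, the refresh itself can be scripted from a Fabric notebook. A minimal sketch, assuming the semantic-link (sempy) library available in Fabric notebooks; the model and workspace names are placeholders:

import sempy.fabric as fabric

# Trigger a refresh of the Direct Lake semantic model so it re-frames
# against the current Delta log. Both names below are placeholders.
fabric.refresh_dataset(dataset="Sales Model", workspace="Finance Workspace")

Note that this only automates the recovery; it does not remove the underlying cause.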
Looking for:
Permanent solution. Is there a way we can mitigate the ParquetNotFound exception from happening in the first place?
Hi @ucherukuri
Thanks for the reply from GilbertQ.
This behavior is by design: overwriting a Direct Lake table requires a manual refresh of the semantic model before reports work properly.
In more detail, the lakehouse stores tables in the Parquet file format, and deleting/re-creating a table deletes/re-creates the Delta log. An existing semantic model that is not refreshed afterwards still assumes the original Delta log exists, which causes the file-not-found error.
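You can observe this from a notebook by inspecting the table's Delta history. A minimal sketch, assuming a PySpark notebook (where spark is predefined) and a hypothetical silver-layer table name:

# After a drop-and-recreate, the history restarts at version 0, which is
# why an unrefreshed Direct Lake model still points at the old files.
spark.sql("DESCRIBE HISTORY silver_lakehouse.dim_customer").show(truncate=False)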
For more details, see the related documentation:
Delta Lake logs in Warehouse - Microsoft Fabric | Microsoft Learn
In addition to the method you are using now, there is a workaround for your reference:
Switching from overwrite mode to append mode prevents the Delta log from being recreated, although note that not all data ingestion methods support append mode.
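For illustration, a minimal PySpark sketch of the two write modes, assuming df is the silver-layer DataFrame built in your notebook and the table name is a placeholder:

# Dropping and re-creating the table on each run resets the Delta log,
# per the explanation above:
# df.write.format("delta").mode("overwrite").saveAsTable("silver_lakehouse.dim_customer")

# Appending adds new commits to the existing Delta log instead, so an
# already-framed Direct Lake model keeps resolving valid files:
df.write.format("delta").mode("append").saveAsTable("silver_lakehouse.dim_customer")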
Best Regards,
Yulia Xu
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Hi @ucherukuri
Are you using Direct Lake for your semantic models, or are you using the Parquet connector to import the data?
My semantic models are built with Direct Lake mode tables.