ucherukuri
Regular Visitor

Frequent report failures due to 404 Parquet Not Found Exceptions

[Screenshot attachment: ucherukuri_0-1723490868349.png]

We consistently receive Parquet not found exceptions in our current setup.

 

Context:

We have a medallion architecture in Fabric (notebooks build silver-layer tables from the bronze raw data in the D365 F&O lakehouse).

 

Error:

We receive these errors regularly and must repeat the fix steps each time.

 

Fix:

Step 1: Go to the source semantic model
Step 2: Click Edit Tables
Step 3: Confirm the schema
Step 4: Refresh the report (a scripted alternative is sketched below)
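
For reference, Step 4 can also be scripted from a Fabric notebook so the refresh runs right after the silver tables are rebuilt. A minimal sketch, assuming the semantic-link (sempy) package that is available in Fabric notebooks, with placeholder model and workspace names:

```python
# Minimal sketch: refresh the Direct Lake semantic model right after the silver
# tables are rebuilt. Assumes the semantic-link (sempy) package available in
# Fabric notebooks; the model and workspace names below are placeholders.
import sempy.fabric as fabric

fabric.refresh_dataset(
    dataset="SilverSalesModel",    # hypothetical semantic model name
    workspace="FinanceAnalytics",  # hypothetical workspace name
)
```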

 

Looking for:

A permanent solution. Is there a way to prevent the ParquetNotFound exception from happening in the first place?

1 ACCEPTED SOLUTION
Anonymous
Not applicable

Hi @ucherukuri 

 

Thanks for the reply from GilbertQ.

 

This behavior is by design: overwriting a Direct Lake table requires a manual refresh of the semantic model for the report to work properly.


In more detail, the underlying storage uses the Parquet file format, and deleting/re-creating a table deletes and re-creates the Delta log. An existing semantic model that is not refreshed afterwards still assumes the original Delta log exists, which causes a file-not-found error.
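
One way to see this from a notebook is to check the table history after the silver notebook has run: a dropped-and-re-created table starts over at version 0, which is the point where an unrefreshed semantic model loses its file references. A minimal sketch, assuming a PySpark notebook and a hypothetical table name:

```python
# Minimal sketch (PySpark in a Fabric notebook): inspect the Delta history after the rebuild.
# "silver_customers" is a hypothetical table name used for illustration.
history = spark.sql("DESCRIBE HISTORY silver_customers")

# After a drop-and-recreate, the history starts over at version 0; the parquet files an
# unrefreshed semantic model still references no longer appear in the new log.
history.select("version", "timestamp", "operation").show(truncate=False)
```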

 

For more details, you can read the related documentation:

Delta Lake logs in Warehouse - Microsoft Fabric | Microsoft Learn

 

In addition to the method you are using now, there is a workaround for your reference:

 

Switching from overwrite mode to append mode prevents the Delta log from being re-created, although not all data ingestion methods support append mode.
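
In a PySpark notebook the difference comes down to the save mode; a minimal sketch with hypothetical DataFrame and table names:

```python
# Minimal sketch of the two save modes in a Fabric PySpark notebook.
# "df_silver" and "silver_customers" are hypothetical names used for illustration.

# Append: adds new parquet files under the existing Delta log, so the file references
# held by an existing Direct Lake semantic model stay valid.
df_silver.write.format("delta").mode("append").saveAsTable("silver_customers")

# Overwrite (or drop-and-recreate): the files the model still points at are replaced,
# so the semantic model needs a refresh before reports query it again.
df_silver.write.format("delta").mode("overwrite").saveAsTable("silver_customers")
```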

 

Best Regards,
Yulia Xu

 

If this post helps, please consider accepting it as the solution to help other members find it more quickly.


3 REPLIES

GilbertQ
Super User

Hi @ucherukuri 

 

Are you using Direct Lake for your semantic models, or are you using the Parquet connector to import the data?





Did I answer your question? Mark my post as a solution!

Proud to be a Super User!

My semantic models are built with Direct Lake mode tables.
