Antongig
New Member

Cannot Remove Invalid Table "_delta_log._commits" — SQL Sync Blocked

Hello,

I'm experiencing a persistent issue in Microsoft Fabric Lakehouse.

 

Lakehouse: Silver_Lakehouse
Issue: SQL Sync fails due to a phantom table _delta_log._commits.
Error Message:
Delta table 'Tables_delta_log_commits_delta_log' not found.
Table: _delta_log._commits
Error code: DeltaTableNotFound


What I’ve tried:
- There is no such folder in /Tables/ or /Tables/_delta_log/
- SHOW TABLES does not list _delta_log._commits
- I cannot drop or overwrite it due to invalid table name:
AnalysisException: Invalid table name silver_lakehouse._delta_log._commits

- It appears to be residual metadata from an interrupted write operation

 

Thanks,

Anton

8 REPLIES
BhaveshPatel
Community Champion

Hi @Antongig 

 

Why do you need the Silver Lakehouse? Could you please explain the whole scenario?

Thanks & Regards,
Bhavesh

Love the Self Service BI.
Please use the 'Mark as answer' link to mark a post that answers your question. If you find a reply helpful, please remember to give Kudos.

Hi Bhavesh, thanks for replying!

I use Silver_Lakehouse as part of a production data pipeline where:

  • Tables are copied from Bronze to Silver using Spark notebooks and pipelines

  • SQL sync is required for warehouse access and reporting

  • Several downstream reports and jobs depend on these tables

The issue is that Fabric tries to sync a non-existent table _delta_log._commits, likely due to a failed write or interrupted notebook cell.
I’ve tried:

  • Dropping the table via Spark SQL (DROP TABLE)

  • Deleting folders via file paths

  • Recreating and overwriting the table

  • Listing all tables (it doesn't appear)

Nothing worked. Now SQL sync always fails.
I cannot delete the Lakehouse easily because many pipelines and references are linked to it by name.

Do you have any suggestion for clearing this ghost table or triggering a hard SQL sync refresh? 

Hi @Antongig 

 

It sounds like the transaction log file was deleted somewhere, or the data under _delta_log is corrupted. In that case, you need to delete the whole table (just that single table) and recreate it using Python or Scala commands, whichever you think is relevant.

 

Also, which database are you using as a source? I mean Azure SQL, SQL Server, etc.

Thanks & Regards,
Bhavesh


The Lakehouse uses Delta Lake format underneath on OneLake storage. 

I already tried deleting and recreating the table, but Fabric’s SQL sync still reports _delta_log._commits as a table, even though it doesn’t exist anymore. I tried:

spark.sql("DROP TABLE IF EXISTS Silver_Lakehouse.`_delta_log._commits`")

and I also tried:

import os

# List everything at the top level of the Tables folder and print
# anything whose name contains "commits"
tables_path = "/lakehouse/default/Tables"
for i in os.listdir(tables_path):
    if "commits" in i:
        print(i)

in my notebook in Fabric. Neither of them does anything.
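The loop above only checks the top level of the Tables folder. A deeper scan might still catch a stray folder nested inside a table directory. Here is a minimal sketch; the `/lakehouse/default/Tables` mount path is taken from the snippet above and `find_stray_entries` is just an illustrative helper name:

```python
import os

def find_stray_entries(tables_path, needle="_delta_log"):
    """Walk the whole Tables tree (not just the top level) and collect
    the full path of any file or folder whose name contains the needle."""
    hits = []
    for root, dirs, files in os.walk(tables_path):
        for name in dirs + files:
            if needle in name:
                hits.append(os.path.join(root, name))
    return sorted(hits)

# In a Fabric notebook you would call it against the mounted path:
# find_stray_entries("/lakehouse/default/Tables")
```

Note that every healthy Delta table contains a legitimate `_delta_log` folder, so you are looking for entries with odd names like `_delta_log._commits`, not for `_delta_log` itself.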

Hi @Antongig ,

Since the table cannot be removed through Spark, file APIs, or SQL commands, and it's impacting downstream workloads, it's best to open a support request with Microsoft Fabric Support.

Please refer to the link below on how to raise a support ticket.
How to create a Fabric and Power BI Support ticket - Power BI | Microsoft Learn 


Regards,
Community Support Team.

Hi @Antongig ,

We’d like to follow up on your support ticket. Has the issue been successfully addressed?

If a solution was provided, we’d truly appreciate it if you could share your experience or any relevant details, your input could be valuable to other community members.

 

Thanks again for being an active part of the Microsoft Fabric Community!

Hi  @Antongig  ,
Thanks for reaching out to the Microsoft Fabric Community forum.
 

You could try using the Lakehouse REST API to list all the tables in your Lakehouse; it sometimes surfaces hidden or stuck entries that don't appear with SHOW TABLES.

Look for the _delta_log._commits entry using the GET Tables API call. If it shows up there, it means the metadata is still registered in the backend.

Here’s the link to the API documentation which may help you:
Lakehouse management API - Microsoft Fabric | Microsoft Learn
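Once you have the JSON response from the List Tables call, a small filter can pick out any phantom entries. This is only a sketch: the payload shape (a `data` list of table objects with a `name` field) follows the Lakehouse management API docs, and the sample response below is made up for illustration:

```python
def find_phantom_tables(response_json, needle="_delta_log"):
    """Return names of table entries whose name contains the needle.

    Assumes the List Tables response shape: {"data": [{"name": ...}, ...]}.
    """
    return [t["name"] for t in response_json.get("data", []) if needle in t["name"]]

# Illustrative response, as the API might return it for the broken Lakehouse:
sample = {
    "data": [
        {"type": "Managed", "name": "customers", "format": "delta"},
        {"type": "Managed", "name": "_delta_log._commits", "format": "delta"},
    ]
}

print(find_phantom_tables(sample))  # ['_delta_log._commits']
```

In practice you would call the endpoint with an authenticated HTTP client (bearer token) and pass `response.json()` into the helper.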

 

If I've misunderstood your needs or you still have problems, please feel free to let us know.

Best Regards,  
Community Support Team 

Hi @Antongig ,

I hope the information provided above assists you in resolving the issue. If you have any additional questions or concerns, please do not hesitate to contact us. We are here to support you and will be happy to help with any further assistance you may need.

Thank you.
