Hello all,
We have the following problem:
We push data on a regular basis from an on-premises system to a Delta table in an Azure Data Lake using the deltalake Python library. Files are written every 10 minutes. We then create a shortcut to this table in a Fabric Lakehouse.
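Roughly, the writer on the edge side does something like this (a minimal sketch; the table URI, credential keys, and sample schema are placeholders, and the exact storage_options names may differ by deltalake version):

import pandas as pd
from deltalake import write_deltalake

# Placeholder URI and credentials for the ADL target
table_uri = "abfss://container@storageaccount.dfs.core.windows.net/Head_Sk2x"
storage_options = {"account_name": "storageaccount", "account_key": "<key>"}

df = pd.DataFrame({"sensor_id": ["a1"], "value": [1.0]})

# Each run appends a new data file and a new entry in _delta_log (every 10 minutes)
write_deltalake(table_uri, df, mode="append", storage_options=storage_options)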
Now I get the following error:
Delta table 'Head_Sk2x' has atleast '100' transaction logs, since last checkpoint. For performance reasons, it is recommended to regularly checkpoint the delta table more frequently than every '100' transactions. As a workaround, please use SQL or Spark to retrieve table schema.
I am running a daily clean-up job on the table (roughly as sketched below), which compacts all the (relatively small) files using OPTIMIZE and VACUUM. But I still get the error.
I am also not able to create checkpoints via this clean-up notebook.
Am I missing something?
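For reference, the clean-up job does roughly the following (a sketch with the deltalake Python API; optimize.compact() and vacuum() exist in recent delta-rs releases, and the URI is a placeholder):

from deltalake import DeltaTable

dt = DeltaTable("abfss://container@storageaccount.dfs.core.windows.net/Head_Sk2x")

# OPTIMIZE: compact the many small 10-minute files into larger ones
dt.optimize.compact()

# VACUUM: actually delete unreferenced files (dry_run defaults to True)
dt.vacuum(retention_hours=168, dry_run=False)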
A second question: can we push the data directly to the Lakehouse table, without the workaround via ADL?
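(What I have in mind would look roughly like this; I have not tried it, and the OneLake URI shape plus the bearer_token and use_fabric_endpoint storage options are assumptions based on the object_store Azure backend:)

from deltalake import write_deltalake

# Hypothetical direct write to the Fabric Lakehouse over OneLake's ABFSS endpoint
onelake_uri = ("abfss://MyWorkspace@onelake.dfs.fabric.microsoft.com/"
               "MyLakehouse.Lakehouse/Tables/Head_Sk2x")

write_deltalake(
    onelake_uri,
    df,  # same DataFrame as in the writer sketch above
    mode="append",
    storage_options={
        "bearer_token": "<AAD token>",  # assumption: token-based auth
        "use_fabric_endpoint": "true",  # assumption: OneLake flag
    },
)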
Any help is welcome
Hi @Anonymous,
We are collecting data from the production side via Siemens Edge devices.
The data can be preprocessed there, and we run a Python-based container that uses the deltalake library to write the data to ADL.
Here is the screenshot of the issue.
Hi @paulv ,
Can you please run this code in a notebook, then refresh the lakehouse table that had the error:
%%spark
import org.apache.spark.sql.delta.DeltaLog

// Force a checkpoint so readers no longer have to replay all the JSON commits
DeltaLog.forTable(spark, "Tables/yourtablenamehere").checkpoint()
Note: The above code is in Scala.
Maybe it will work for you. Please let me know in case of further queries.
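If you prefer to keep everything in your Python clean-up job instead of Spark, newer versions of the deltalake library expose a similar call (a sketch; create_checkpoint() is available in recent delta-rs releases, and the URI is a placeholder):

from deltalake import DeltaTable

dt = DeltaTable("abfss://container@storageaccount.dfs.core.windows.net/Head_Sk2x")

# Writes a Parquet checkpoint into _delta_log so readers no longer
# have to replay 100+ JSON commit files
dt.create_checkpoint()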
Hi @Anonymous ,
Thanks a lot, this solves the error.
I included this in the clean-up notebook and it works fine.
Hi @paulv ,
Glad to know your issue got resolved. Please continue using Fabric Community for your further queries.
Hi @paulv ,
Thanks for using Fabric Community.
I would like to understand where you are running the deltalake Python library.
How are you moving the data from on-prem to the Delta table - is it using Fabric Notebooks?
If possible, can you please share a screenshot of the issue?