I am aware that you have to use a notebook to delete rows from a lakehouse table, but every method I have used gives me the same error. I have tried Spark SQL and two methods using Python, and they all return the following error. What am I doing wrong?
Error while decoding: requirement failed: Mismatched minReaderVersion and readerFeatures.
Method #2:
Method #3:
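For context, a minimal sketch of the kind of delete being attempted here, with an illustrative table name and predicate:

```sql
-- Illustrative only: a standard Delta delete from a Fabric notebook cell
DELETE FROM my_lakehouse_table
WHERE order_date < '2023-01-01';
```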
So the table is completely unreadable from a notebook. Strangely enough, a semantic model built on top of it reads the data just fine. I'm not sure what the problem is for Spark, but DAX must read it differently.
Hi @aarongiust ,
Can you try the below on your table and then see if it works out?
ALTER TABLE <table-identifier> SET TBLPROPERTIES('delta.minReaderVersion' = '1', 'delta.minWriterVersion' = '2')
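If the ALTER goes through, you can confirm the table's current protocol versions with DESCRIBE DETAIL, whose output includes the minReaderVersion and minWriterVersion columns (table name is illustrative):

```sql
-- Check the table's Delta protocol versions before and after the ALTER
DESCRIBE DETAIL my_lakehouse_table;
```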
I get the same error when I run the ALTER statement.
Hi @aarongiust ,
Apologies for the issue you are facing. It requires a deeper investigation by our engineering team into your workspace and the logic behind it to properly understand what might be happening.
Please go ahead and raise a support ticket to reach our support team: support-ticket
Please provide the ticket number here so we can keep an eye on it.
I submitted ticket #2401030040011861.
Hello @aarongiust ,
We haven't heard from you since our last response and wanted to check whether you have found a resolution yet.
If you have, please share it with the community, as it may help others.
Otherwise, please reply with more details and we will try to help.
Are you able to read data without issue?
I believe it's a bug in Fabric. There seems to be some corruption in a JSON commit file within the table's _delta_log folder. You can try opening this table's JSON file and the JSON file of another table where deletes work, and compare the two.
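When comparing those commit files, the line to look at is the protocol action. Per the Delta protocol, a readerFeatures list should appear exactly when minReaderVersion is 3; a mismatch between the two is what the error message names. A small sketch of that check (the sample JSON lines below are illustrative, not taken from the poster's table):

```python
import json

# Illustrative protocol actions like those found in _delta_log/*.json files.
# Reader version 1/2 tables carry no readerFeatures; version 3 tables must
# list them. A mismatch can surface as
# "Mismatched minReaderVersion and readerFeatures".
healthy = '{"protocol":{"minReaderVersion":1,"minWriterVersion":2}}'
v3 = ('{"protocol":{"minReaderVersion":3,"minWriterVersion":7,'
      '"readerFeatures":["deletionVectors"],"writerFeatures":["deletionVectors"]}}')

def protocol_ok(line: str) -> bool:
    """Sanity-check a protocol action: readerFeatures must be present
    exactly when minReaderVersion is 3 or higher."""
    proto = json.loads(line).get("protocol", {})
    if proto.get("minReaderVersion", 1) >= 3:
        return "readerFeatures" in proto
    return "readerFeatures" not in proto

print(protocol_ok(healthy))  # True
print(protocol_ok(v3))       # True
```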
To fix it, I think the best approach would be to read the data in Spark, write it out as a new table in the lakehouse, and then see whether that resolves the version error.
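A minimal sketch of that rebuild, assuming Spark can still read the source table at all (names are illustrative):

```sql
-- Copy the data into a fresh Delta table; validate it, then
-- drop/rename once you are satisfied the new table is complete
CREATE TABLE my_table_rebuilt AS
SELECT * FROM my_table;
```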