
Mayumi_2021
Frequent Visitor

Custom Delta Retention period for Lakehouse table

I want to know: if we set the Delta retention period for a Lakehouse table to a very large value, e.g. 3650 days, does performance degrade over time? My daily data volume into the table is limited to at most a thousand records per day.

I plan to run OPTIMIZE daily or weekly.

1 ACCEPTED SOLUTION
nilendraFabric
Community Champion

Hello @Mayumi_2021 

 

Given your daily data volume of at most a thousand records:
1. The impact on performance should be minimal with proper maintenance, considering the relatively small daily data volume.
2. Regular OPTIMIZE operations will help maintain good read performance by compacting small files.
3. Ensure you run VACUUM operations periodically to clean up old files; this is especially important with such a long retention period (see the sketch after this list).
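
As an illustration, a minimal maintenance cell you could schedule in a Fabric notebook might look like the sketch below. The table name my_lakehouse.daily_events is only a placeholder for your own Lakehouse table:

    # PySpark in a Fabric notebook; spark is normally pre-created there,
    # getOrCreate() just makes the sketch self-contained.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    table_name = "my_lakehouse.daily_events"  # placeholder name

    # Compact the many small files written by the daily loads so reads stay fast.
    spark.sql(f"OPTIMIZE {table_name}")

    # Remove data files that are no longer referenced by the Delta log and are
    # older than the table's retention threshold; this keeps storage in check.
    spark.sql(f"VACUUM {table_name}")

With roughly a thousand rows a day, a weekly OPTIMIZE is usually enough, and VACUUM can run on the same schedule.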

 

Some basic guidelines around retention period and performance:
1. The default file retention threshold for Delta tables in Fabric is seven days.
2. Setting a longer retention period affects Delta's time travel capabilities but doesn't necessarily degrade performance if managed properly.
3. It's generally recommended to set a retention interval of at least seven days, because old snapshots and uncommitted files may still be in use by concurrent table readers and writers (a sketch of how to set a longer retention follows this list).
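
If you do go with a very long retention such as the 3650 days from your question, one way to express it is through the standard Delta table properties. This is only a sketch, again using a placeholder table name:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()
    table_name = "my_lakehouse.daily_events"  # placeholder name

    # deletedFileRetentionDuration: how long removed data files are kept for
    # time travel before VACUUM may delete them (default is 7 days).
    # logRetentionDuration: how long transaction-log history is retained.
    spark.sql(f"""
        ALTER TABLE {table_name} SET TBLPROPERTIES (
            'delta.deletedFileRetentionDuration' = 'interval 3650 days',
            'delta.logRetentionDuration' = 'interval 3650 days'
        )
    """)

    # Confirm the properties were applied.
    spark.sql(f"SHOW TBLPROPERTIES {table_name}").show(truncate=False)

Keep in mind that with these settings VACUUM will retain roughly ten years of old file versions, so storage grows accordingly even though read performance remains driven mainly by OPTIMIZE.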

 

Optimization and Maintenance
To maintain performance with a long retention period:
1. Regular maintenance is crucial. Running OPTIMIZE operations, as you plan to do daily or weekly, is a good practice.
2. The VACUUM operation is important for removing old files no longer referenced by the Delta table log, which helps optimize storage costs.
3. Fabric also provides a GUI option in the context menu of each table to run ad-hoc operations such as OPTIMIZE and VACUUM (a Python-API alternative is sketched below).
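
For completeness, the same maintenance can also be run through the Delta Lake Python API instead of SQL; a rough equivalent for the same placeholder table:

    from pyspark.sql import SparkSession
    from delta.tables import DeltaTable

    spark = SparkSession.builder.getOrCreate()
    dt = DeltaTable.forName(spark, "my_lakehouse.daily_events")  # placeholder name

    # Compact small files (same effect as SQL OPTIMIZE).
    dt.optimize().executeCompaction()

    # Delete unreferenced files older than the table's configured retention.
    dt.vacuum()

    # Optional: inspect recent history to confirm the maintenance operations ran.
    dt.history(10).select("version", "timestamp", "operation").show(truncate=False)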

If this is helpful, please accept this solution and give kudos.

 


3 REPLIES
v-tsaipranay
Community Support

Hello @Mayumi_2021 ,

 

Thank you for reaching out to the Microsoft Fabric Community forum.

 

The suggestion provided by @nilendraFabric already covers everything. In addition, your concerns are definitely worth considering, and it's great that you're already thinking about proper maintenance. The proactive approach you're planning, with regular optimization and vacuuming along with continuous monitoring, should allow you to handle the long Delta retention period without major issues.

 

If this post helps, please give us Kudos and consider accepting it as a solution to help other members find it more quickly.

 

Thank you. 

Hi @Mayumi_2021

I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.

 

Thank you.




 
