Hi Team,
In Azure Blob Storage we have lifecycle management to move data between the hot, cool, cold, and archive tiers. Do we have anything similar in the Lakehouse that could help us save costs on unused or rarely used files?
Hi @Srisakthi,
The Lakehouse does not currently include a file archival feature. As a workaround, you can use a pipeline to export the cold data to the Azure side, or use a notebook to move the cold files into a subfolder under the Lakehouse 'Files' area (you can also export these files to a local drive or OneDrive via OneLake file explorer).
Access Fabric data locally with OneLake file explorer - Microsoft Fabric | Microsoft Learn
If you need the data again, you can simply use a dataflow, pipeline, or shortcut to bring it back into the Lakehouse.
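For example, here is a minimal notebook sketch of the second approach. It assumes the notebook has a default Lakehouse attached, which exposes the Files area at the local path /lakehouse/default/Files; the "raw" and "archive" folder names and the 90-day cutoff are just placeholders for illustration.

```python
import os
import shutil
import time

# Placeholder folders under the Lakehouse Files area (adjust to your layout)
SOURCE_DIR = "/lakehouse/default/Files/raw"
ARCHIVE_DIR = "/lakehouse/default/Files/archive"
CUTOFF_SECONDS = 90 * 24 * 60 * 60  # treat files untouched for 90 days as cold

os.makedirs(ARCHIVE_DIR, exist_ok=True)
now = time.time()

for name in os.listdir(SOURCE_DIR):
    src_path = os.path.join(SOURCE_DIR, name)
    # Move files whose last modification time is older than the cutoff
    if os.path.isfile(src_path) and now - os.path.getmtime(src_path) > CUTOFF_SECONDS:
        shutil.move(src_path, os.path.join(ARCHIVE_DIR, name))
        print(f"Archived {name}")
```

You could schedule a notebook like this in a pipeline so the cold files are swept into the archive folder (or exported to Azure) on a regular cadence.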
Regards,
Xiaoxin Sheng
@Srisakthi you can create an idea for this here: https://ideas.fabric.microsoft.com/
Let me know if you do, and I'll vote for it ☺️
Hi @frithjof_v ,
I have submitted an idea:
https://ideas.fabric.microsoft.com/ideas/idea/?ideaid=8e7f07a0-294f-ef11-b4ac-6045bd8620f1
Hi @Anonymous ,
Thank you! I understand we have to use a workaround to achieve this. I just wanted to check whether Fabric has, or might bring, a feature similar to Azure Blob lifecycle management.