Hi All,
Is there any cold storage (archival) process in Microsoft Fabric? How do we move old data to cold storage?
Hi @Anonymous ,
Thank you @nilendraFabric for providing the possible solution. Here is the link which may help you in resolving the issue.
Solved: Re: Lakehouse Files archival - Microsoft Fabric Community
Thank you for reaching out to us on the Microsoft Fabric Community Forum.
Regards,
Menaka.
Hi @Anonymous ,
May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems find the solution faster.
Thank you.
Hi @Anonymous ,
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.
Thank you and Regards,
Menaka.
Hi @Anonymous ,
I hope this information is helpful. Please let me know if you have any further questions or if you'd like to discuss this further. If this answers your question, please Accept it as a solution and give it a 'Kudos' so others can find it easily.
Thank you.
Hello @Anonymous
OneLake currently lacks built-in archival tiers, so archival to cold storage has to be done with ADLS Gen2 at the moment:

1. Deploy an ADLS Gen2 storage account in the same Azure region as your Fabric tenant.
2. Enable hierarchical namespace on the account (required for Delta/Parquet compatibility). A sketch covering steps 1 and 2 follows this list.
3. Add a shortcut in your Lakehouse pointing to the ADLS Gen2 container/path (e.g., `abfss://archive@storageaccount.dfs.core.windows.net/`), as sketched below.
4. Create a pipeline (or notebook) to move old data from OneLake to ADLS Gen2 (see the PySpark sketch below).
5. Use ADLS lifecycle management policies to automatically tier data to Cold (90-day minimum retention) or Archive (180-day minimum retention) based on access patterns (see the policy sketch below).
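For steps 1 and 2, a minimal sketch using the azure-mgmt-storage Python SDK (this can also be done in the Azure portal). The subscription ID, resource group, account name, and region below are placeholders:

```python
# Sketch: create an ADLS Gen2 account with hierarchical namespace enabled.
# Subscription, resource group, account name, and region are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import StorageAccountCreateParameters, Sku

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

poller = client.storage_accounts.begin_create(
    "my-resource-group",
    "archivestorageaccount",
    StorageAccountCreateParameters(
        sku=Sku(name="Standard_LRS"),
        kind="StorageV2",
        location="westeurope",       # same region as your Fabric capacity
        is_hns_enabled=True,         # hierarchical namespace = ADLS Gen2
    ),
)
account = poller.result()
print(account.primary_endpoints.dfs)
```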
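For step 3, the shortcut can be created from the Lakehouse UI; if you prefer automation, here is a hedged sketch against the OneLake Shortcuts REST API. The workspace, Lakehouse, and connection IDs and the storage URL are placeholders, and the payload shape should be verified against the current API reference:

```python
# Sketch: create an ADLS Gen2 shortcut in a Lakehouse via the OneLake Shortcuts REST API.
# Workspace ID, Lakehouse ID, connection ID, and storage URL are placeholders.
import requests

workspace_id = "<workspace-guid>"
lakehouse_id = "<lakehouse-guid>"
token = "<bearer-token>"  # e.g. acquired via azure.identity with scope https://api.fabric.microsoft.com/.default

payload = {
    "path": "Files",       # where the shortcut appears inside the Lakehouse
    "name": "archive",     # shortcut name
    "target": {
        "adlsGen2": {
            "location": "https://storageaccount.dfs.core.windows.net",
            "subpath": "/archive",                  # container (and optional folder)
            "connectionId": "<connection-guid>",    # cloud connection with access to the account
        }
    },
}

resp = requests.post(
    f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/items/{lakehouse_id}/shortcuts",
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
)
resp.raise_for_status()
print(resp.json())
```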
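For step 4, a Data Factory pipeline with a Copy activity works; a Fabric notebook is another option. A minimal PySpark sketch below, assuming a Delta table named `sales` with a `sale_date` column (both names are hypothetical) and the `archive` shortcut from step 3, with the Lakehouse attached as the notebook's default:

```python
# Sketch: archive rows older than 24 months from a Lakehouse Delta table to the ADLS Gen2 shortcut.
# Runs in a Fabric notebook where `spark` is available; table and column names are examples.
from pyspark.sql import functions as F

cutoff = F.add_months(F.current_date(), -24)   # keep the last 24 months "hot"

old_rows = spark.table("sales").where(F.col("sale_date") < cutoff)

# Write the old rows through the shortcut; they land in the ADLS Gen2 container.
# Use the full abfss:// path instead if no default Lakehouse is attached.
old_rows.write.mode("append").parquet("Files/archive/sales")

# Remove the archived rows from the hot table.
spark.sql("DELETE FROM sales WHERE sale_date < add_months(current_date(), -24)")
```

In practice you would verify the archived copy (row counts, checksums) before running the `DELETE`, and schedule the notebook or pipeline to run on a recurring basis.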
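For step 5, the lifecycle policy can be configured in the portal or via azure-mgmt-storage. A sketch below; the rule name and `archive/` prefix are assumptions, the 90/180-day thresholds simply mirror the figures above, and `tier_to_cold` requires a recent SDK/API version:

```python
# Sketch: lifecycle rule that tiers archived blobs to Cold after 90 days and Archive after 180 days.
# Resource group, account name, rule name, and prefix filter are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import (
    ManagementPolicy, ManagementPolicySchema, ManagementPolicyRule,
    ManagementPolicyDefinition, ManagementPolicyFilter, ManagementPolicyAction,
    ManagementPolicyBaseBlob, DateAfterModification,
)

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

policy = ManagementPolicy(
    policy=ManagementPolicySchema(
        rules=[
            ManagementPolicyRule(
                enabled=True,
                name="tier-archived-data",
                type="Lifecycle",
                definition=ManagementPolicyDefinition(
                    filters=ManagementPolicyFilter(
                        blob_types=["blockBlob"],
                        prefix_match=["archive/"],   # only touch the archive container
                    ),
                    actions=ManagementPolicyAction(
                        base_blob=ManagementPolicyBaseBlob(
                            tier_to_cold=DateAfterModification(days_after_modification_greater_than=90),
                            tier_to_archive=DateAfterModification(days_after_modification_greater_than=180),
                        )
                    ),
                ),
            )
        ]
    )
)

client.management_policies.create_or_update(
    "my-resource-group", "archivestorageaccount", "default", policy
)
```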
Hope this helps. Please accept the answer if this is useful.