Hi All,
I'm currently exploring the automation of data ingestion from various external databases into a Microsoft-based data lakehouse. I'm keen to understand whether anyone in the community has experience implementing a full Microsoft Fabric solution for this purpose.
Implementation Experiences: For those who have set up automated data ingestion into a lakehouse, could you share your insights on the tools and processes you utilized, particularly within the Microsoft ecosystem (Azure, Synapse, etc.)?
Data Flows and Architecture: What was your approach to building data flows? Did you employ Azure Data Factory, Azure Functions, or any other services? How did you ensure data consistency and reliability?
Challenges and Lessons Learned: What challenges did you encounter during the implementation, and how did you overcome them? Are there any lessons learned or best practices you can share, especially regarding data governance and security?
I am looking to gather real-world insights into building a robust and scalable data ingestion pipeline. Any shared experiences, architecture patterns, or advice on best practices would be invaluable and greatly appreciated.
Thank you for your time and help!
Hi @HamidBee ,
Microsoft provides a step-by-step guide on ingesting data from an Azure storage account into the lakehouse using the Copy data activity in a Data Factory pipeline. You can also use Azure Functions to build custom ingestion pipelines that are triggered by events such as new data arriving in a storage account.
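If you go the Azure Functions route, a minimal sketch of an event-triggered ingestion function could look like the one below. This assumes the Python v2 programming model and writes to the lakehouse through OneLake's ADLS Gen2-compatible endpoint; the connection setting name, workspace name, and lakehouse name are placeholders you would replace with your own.

# Minimal sketch, not a full implementation. "SourceStorage", "MyWorkspace" and
# "MyLakehouse" are placeholder names for illustration only.
import azure.functions as func
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

app = func.FunctionApp()

# Fires whenever a new blob lands in the "landing" container of the source account.
@app.blob_trigger(arg_name="blob", path="landing/{name}", connection="SourceStorage")
def ingest_new_file(blob: func.InputStream):
    # OneLake exposes an ADLS Gen2-compatible endpoint, so the Data Lake SDK
    # can write directly into the lakehouse Files area.
    service = DataLakeServiceClient(
        account_url="https://onelake.dfs.fabric.microsoft.com",
        credential=DefaultAzureCredential(),
    )
    workspace_fs = service.get_file_system_client("MyWorkspace")
    file_name = blob.name.split("/")[-1]
    target = workspace_fs.get_file_client(f"MyLakehouse.Lakehouse/Files/raw/{file_name}")
    target.upload_data(blob.read(), overwrite=True)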
When building data flows, it also helps to settle the architecture of your data lakehouse up front, for example which data lands as raw files in the Files area and which is curated into Delta tables in the Tables area.
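As a rough illustration of that layering, a Fabric notebook cell could load a landed file into a bronze Delta table. This sketch assumes the notebook has the target lakehouse attached as its default (so relative "Files/..." paths resolve to it); the file and table names are examples only.

# Minimal sketch of a Fabric notebook cell. The built-in "spark" session is
# available in Fabric notebooks; file and table names are illustrative.
raw_df = (
    spark.read
         .option("header", True)
         .csv("Files/raw/orders.csv")   # file previously landed in the Files area
)

(
    raw_df.write
          .mode("append")
          .format("delta")
          .saveAsTable("bronze_orders")  # creates/updates a Delta table in the Tables area
)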
The official documentation linked below should help:
Lakehouse tutorial - Ingest data into the lakehouse - Microsoft Fabric | Microsoft Learn
Best Regards,
Xianda Tang
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Brilliant. Thanks for sharing.