I need to ingest data in real time from an Azure SQL Database with CDC enabled into a Microsoft Fabric Lakehouse. The setup involves:
Key requirements:
Questions:
Architecture:
What would be the best architecture within Microsoft Fabric to handle such a large-scale ingestion task efficiently? Is there an established pattern for managing this volume and velocity of data in real time?
Schema Evolution:
Are there any built-in tools or best practices in Delta Lake or Lakehouse for dynamically handling schema evolution (e.g., column additions, type changes) across multiple tables?
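For reference, Delta Lake does support additive schema evolution (e.g., via the `mergeSchema` write option), while type changes generally require an explicit migration. A minimal, hypothetical sketch of that merge behavior, using plain Python dicts in place of a real Spark session:

```python
# Sketch of additive schema evolution, mirroring Delta Lake's rules:
# new columns are merged in, but a type change on an existing column
# is rejected and would need an explicit cast or table rewrite.

def merge_schema(current: dict, incoming: dict) -> dict:
    """Return the evolved schema after an additive merge."""
    evolved = dict(current)
    for col, dtype in incoming.items():
        if col not in evolved:
            evolved[col] = dtype  # column addition: allowed
        elif evolved[col] != dtype:
            raise TypeError(f"type change on '{col}' needs an explicit migration")
    return evolved

# Example: a CDC batch arrives with a new 'email' column.
base = {"id": "int", "name": "string"}
batch = {"id": "int", "name": "string", "email": "string"}
print(merge_schema(base, batch))
# → {'id': 'int', 'name': 'string', 'email': 'string'}
```

In a Fabric notebook the equivalent would be writing the batch with `.option("mergeSchema", "true")` on the Delta write; `merge_schema` here is only an illustration of the rule, not a Fabric API.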
Real-Time Streaming with Fabric Notebooks:
Parallelism and Automation:
What is the best way to achieve parallelism and automation for processing 1000+ tables in real time? Is there a preferred way to orchestrate multiple table ingestion tasks while maintaining low latency and high performance?
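One common pattern for the fan-out described above is to drive per-table ingestion from a worker pool inside a notebook. A minimal sketch, where `ingest_table` is a hypothetical placeholder for the real per-table CDC load (the names are illustrative, not a Fabric API):

```python
# Fan out ingestion of many tables with a bounded thread pool, as one
# might inside a Fabric notebook driving per-table Spark jobs.
from concurrent.futures import ThreadPoolExecutor, as_completed

def ingest_table(table: str) -> str:
    # Placeholder: in a real notebook this would read the table's CDC
    # feed and MERGE it into the corresponding Lakehouse Delta table.
    return f"{table}: ok"

tables = [f"dbo.table_{i:04d}" for i in range(1000)]

results = []
# max_workers bounds concurrency so 1000+ tables don't all run at once.
with ThreadPoolExecutor(max_workers=16) as pool:
    futures = {pool.submit(ingest_table, t): t for t in tables}
    for fut in as_completed(futures):
        results.append(fut.result())

print(len(results))  # → 1000
```

In practice you would tune `max_workers` against capacity, add retries per table, and log failures instead of letting one table stop the run.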
Thank you
Hi @PrachiJain_2025 ,
Let me answer your questions specifically.
If the suggestions above do not provide an acceptable solution, please share specific scenarios you are encountering and the challenges you are facing with them.
Hello @DataBard ,
Thank you so much for the suggestions!
1. Database Mirroring will not work because of its 500-table limit, and we have 500+ tables. I don't want to split the main database into two subsets, and I want to set this up for multiple databases.
2. Streaming Azure SQL DB CDC into the Real-Time Hub: will this option handle schema evolution? The eventstream can handle row-level changes and will keep updating the data incrementally. From Azure SQL CDC through the eventstream, my destination is the Lakehouse. How will it handle schema evolution?
Thank you so much!