Hi,
We've set up a mirrored database. I can use the SQL endpoint in the eventstream, but no events are passed. I'm not seeing a way to connect directly to the mirrored database in an event stream. Is this achievable, or will we need to use a standard SQL instance with CDC enabled?
I've also tried Stream Analytics, and a mirrored DB cannot be accessed there either. Again, the endpoint can be used, but no events are passed into the event stream.
Goal:
Snowflake DB-->Mirrored DB(Fabric)-->Event Stream-->PBI Report
Hi @robertpayne21 ,
Mirrored databases in Fabric are designed primarily for replication and analytics-ready data storage, rather than direct event streaming. They replicate your existing data estate into Fabric's OneLake in an analytics-ready format, such as Parquet, and are typically used for batch analytics rather than real-time event processing. This might explain why you're not seeing events passed when attempting to use a mirrored database as a direct source for an event stream.
For real-time event processing and streaming into Power BI, you would generally need a source that can generate or capture events in real-time, such as Azure Event Hubs or IoT Hubs, which can then be connected to an event stream in Fabric.
There are some possible methods for your reference:
1. Consider using a standard SQL instance with Change Data Capture (CDC) enabled. CDC captures changes in the database and can be used as a source for your event stream (see the first sketch after this list).
The flow would be: Snowflake DB → Standard SQL DB with CDC → Event Stream → Power BI Report.
2. Integrate Snowflake directly with Azure Event Hubs (see the second sketch below). The flow would be: Snowflake DB → Azure Event Hubs → Fabric Event Stream → Power BI Report.
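For option 1, the sketch below shows one way to enable CDC on the intermediate SQL database from Python. This is a minimal example under assumptions, not an official setup script: the connection string and the dbo.SalesOrders table are placeholders for your own environment, and enabling CDC requires sufficient permissions (db_owner on Azure SQL Database) and a supported service tier.

```python
# Minimal sketch: enable CDC on the intermediate SQL database and table from Python.
# Requires: pip install pyodbc
# The connection string and table name (dbo.SalesOrders) are placeholders.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=your-sql-server.database.windows.net;"
    "DATABASE=YourDatabase;"
    "UID=your_user;PWD=your_password;Encrypt=yes"
)

with pyodbc.connect(CONN_STR, autocommit=True) as conn:
    cur = conn.cursor()
    # Enable CDC at the database level (needs db_owner on Azure SQL Database).
    cur.execute("EXEC sys.sp_cdc_enable_db;")
    # Enable CDC for the table whose changes should flow into the eventstream.
    cur.execute(
        "EXEC sys.sp_cdc_enable_table "
        "@source_schema = N'dbo', "
        "@source_name = N'SalesOrders', "
        "@role_name = NULL;"
    )
    print("CDC enabled for dbo.SalesOrders")
```

Once CDC is enabled on the database and table, you should be able to add that database as an Azure SQL DB (CDC) source in the eventstream.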
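For option 2, here is a rough sketch of pushing changed rows from Snowflake into Azure Event Hubs with Python, which the eventstream can then consume as a streaming source. All connection details, the SALES_ORDERS table, and the UPDATED_AT watermark column are assumptions for illustration; in practice you would schedule something like this (or use Snowflake tasks / a Kafka connector) to match your own change-detection approach.

```python
# Rough sketch: poll Snowflake for changed rows and forward them to Azure Event Hubs.
# The Fabric eventstream can then use Event Hubs as a streaming source.
# Requires: pip install snowflake-connector-python azure-eventhub
# All names and credentials below are placeholders, and the UPDATED_AT watermark
# column is an assumption about your schema.
import json

import snowflake.connector
from azure.eventhub import EventData, EventHubProducerClient

SNOWFLAKE_CFG = {
    "user": "your_user",
    "password": "your_password",
    "account": "your_account",
    "warehouse": "your_warehouse",
    "database": "YOUR_DB",
    "schema": "PUBLIC",
}
EVENT_HUB_CONN = "Endpoint=sb://your-namespace.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=..."
EVENT_HUB_NAME = "snowflake-changes"


def fetch_recent_rows(since_ts: str):
    """Return rows changed since the last watermark (assumes an UPDATED_AT column)."""
    conn = snowflake.connector.connect(**SNOWFLAKE_CFG)
    try:
        cur = conn.cursor(snowflake.connector.DictCursor)
        cur.execute(
            "SELECT * FROM SALES_ORDERS WHERE UPDATED_AT > %s ORDER BY UPDATED_AT",
            (since_ts,),
        )
        return cur.fetchall()
    finally:
        conn.close()


def send_to_event_hub(rows):
    """Send each row as a JSON event; chunk into multiple batches for large volumes."""
    producer = EventHubProducerClient.from_connection_string(
        EVENT_HUB_CONN, eventhub_name=EVENT_HUB_NAME
    )
    with producer:
        batch = producer.create_batch()
        for row in rows:
            batch.add(EventData(json.dumps(row, default=str)))
        producer.send_batch(batch)


if __name__ == "__main__":
    rows = fetch_recent_rows("2025-01-01 00:00:00")
    if rows:
        send_to_event_hub(rows)
```

From there, Event Hubs can be added as a source in the eventstream and routed to a Power BI destination for the report.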
Best Regards,
Neeko Tang
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.
Hi @Anonymous, my apologies for the delayed response. I will give this a test and circle back. Thank you for your response; it is appreciated.
Thanks,
Robert
Hi @robertpayne21 ,
Has your problem been solved? If so, please mark the reply that worked as the accepted solution to help other members find it more quickly. If not, please let us know. Thanks in advance.
Best Regards,
Neeko Tang
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.