I have an Azure SQL DB with CDC enabled as a source, and I want the changes to be reflected in a Fabric Lakehouse.
I initially created a pipeline to copy the table from the Azure SQL DB to the Fabric Lakehouse. Now I have an eventstream (snapshot attached). The changes can be seen in the eventstream, but they are not written to the corresponding table in the Lakehouse.
I tried this 2-3 months ago and it was working then.
Could anyone help me resolve it?
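For context, CDC is enabled on the database and on the table. A minimal check I can run against the source looks like this (the connection string and table name are placeholders, and pyodbc is just one way to query the catalog views):

import pyodbc

# Placeholder connection string for the Azure SQL source (adjust server, database, credentials).
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=mydb;"
    "UID=myuser;PWD=mypassword;Encrypt=yes;"
)
cur = conn.cursor()

# Is CDC enabled at the database level?
cur.execute("SELECT name, is_cdc_enabled FROM sys.databases WHERE name = DB_NAME()")
print(cur.fetchone())

# Is CDC enabled on the specific table the eventstream reads from? ("MyCdcTable" is a placeholder.)
cur.execute(
    "SELECT s.name, t.name, t.is_tracked_by_cdc "
    "FROM sys.tables t JOIN sys.schemas s ON t.schema_id = s.schema_id "
    "WHERE t.name = ?",
    ("MyCdcTable",),
)
print(cur.fetchall())
conn.close()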
Hi @Sudip ,
Has your problem been solved? If it is solved, please share the workaround to help users with the same problem. Thanks in advance.
Best Regards,
Neeko Tang
Hi @Sudip ,
Ensure that the data types in your Azure SQL Database match those in your Fabric Lakehouse. The error message indicates a type conversion error, which often occurs when there is a mismatch in data types.
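One quick way to compare them is to print the Lakehouse table's schema from a Fabric notebook and hold it against the source column types. A minimal sketch (the table name below is only a placeholder):

# Run in a Fabric notebook attached to the Lakehouse; "dbo_MyCdcTable" is a placeholder.
df = spark.read.table("dbo_MyCdcTable")

# Print the Delta table's columns and Spark types so they can be compared
# with the Azure SQL column types (e.g. datetime2 vs timestamp, decimal precision).
df.printSchema()
for field in df.schema.fields:
    print(field.name, field.dataType.simpleString())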
Best Regards,
Neeko Tang
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
I get that, but the problem is that there is no datetime column in my source.
Hi @Sudip ,
Do you mean you want to add a datetime column? You can refer to the following operation:
Best Regards,
Neeko Tang
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
No, I don't need any extra columns. I expect the eventstream to write data to the existing table inside the Lakehouse.
Here is the simple workflow:
1. Azure SQL db (CDC) as source, single table selected for now to pass into eventstream
2. Lakehouse as destination where the table already exists
3. Made some changes in the source table (which I expect to be reflected in the Lakehouse table)
4. It's not happening. I see the changes in the eventstream, but they are not written to the Lakehouse table.
Could you help with this?
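For what it's worth, this is how I check on my side whether anything reaches the table at all (the table name is a placeholder):

# Fabric notebook attached to the destination Lakehouse; "dbo_MyCdcTable" is a placeholder.
df = spark.read.table("dbo_MyCdcTable")

# If this count never changes after edits in the Azure SQL source,
# the eventstream write is not landing in the table.
print("row count:", df.count())
df.show(10, truncate=False)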
Hi @Sudip ,
If you skip the Manage fields step, add the Lakehouse as a destination and select 'Create new table', can you find the new table in the Lakehouse? (A quick way to list the Lakehouse tables from a notebook is sketched below.)
If you can see the new table after refreshing, then the problem likely lies in the Manage fields step. If you still can't see it, I suggest you open a support ticket so a dedicated Microsoft engineer can help solve the problem.
It would be great if you continue to share in this thread once you know the root cause or solution, to help others with similar problems.
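As a sketch, listing the tables in the destination Lakehouse from a notebook shows whether the eventstream created anything at all:

# Fabric notebook attached to the Lakehouse used as the eventstream destination.
# Lists every table, so you can see whether "Create new table" produced one.
spark.sql("SHOW TABLES").show(truncate=False)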
The link of Power BI Support: Support | Microsoft Power BI
For how to create a support ticket, please refer to How to create a support ticket in Power BI - Microsoft Power BI Community
Best Regards,
Neeko Tang
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Hi @Sudip ,
1. Verify Eventstream Configuration:
Ensure that the eventstream is correctly configured to capture changes from your Azure SQL Database with CDC enabled.
Check that the eventstream is connected to the correct table in the Lakehouse.
2. Check Permissions:
Make sure you have the necessary permissions in both the Azure SQL Database and the Fabric Lakehouse.
Verify that CDC is enabled on the specific tables you are monitoring.
3. Ensure that your Azure SQL server is running and that the database is publicly accessible, not located behind a firewall or protected in a virtual network (a simple connectivity check is sketched after this list).
4. Go to the Lakehouse, find the table and click refresh to see if there are any errors. If there are no errors, go back to the eventstream to check the data preview.
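For point 3, a simple reachability test run from outside your network can confirm whether the server accepts connections on port 1433 (the server name is a placeholder):

import socket

# Azure SQL listens on TCP 1433; "myserver.database.windows.net" is a placeholder.
server = "myserver.database.windows.net"

# If this times out, a firewall rule, private endpoint or VNet restriction is
# likely blocking the eventstream's access to the database.
with socket.create_connection((server, 1433), timeout=10):
    print("TCP connection to", server, "succeeded")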
For more details, please refer to: Add Azure SQL Database CDC source to an eventstream - Microsoft Fabric | Microsoft Learn
Add a lakehouse destination to an eventstream - Microsoft Fabric | Microsoft Learn
Best Regards,
Neeko Tang
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
As you can also see here, we have input events, but nothing is being written into the Lakehouse. Why is that?
No, it's not resolved yet. Here is one warning; maybe that's the obstacle:
"Message": {
"value": "First Occurred: 8/7/2024 1:24:16 PM UTC | Resource Name: dst-adventureworks | Message: Source 'dst-adventureworks' had 1 occurrences of kind 'OutputDataConversionError.TypeConversionError' between processing times '2024-08-07T13:24:16.3491817Z' and '2024-08-07T13:24:16.3542642Z'. \r\n",
"type": "string"