Hi all,
My architecture in Fabric is:
Raw -> Bronze (with history) -> Silver -> Gold
This post concerns the Raw and Bronze layers.
In Raw, we are consuming different data sources.
For example, we have a Dataverse Direct Link adding information into our Raw.
1) Behavior of Direct Link:
There is a Dataverse Direct Link, created on the Dataverse side, which continuously exposes shortcuts in our Raw.
The Direct Link creates, by itself, a lakehouse in our Raw workspace and keeps it up to date with all the tables.
Data keeps arriving, and when something is deleted at source the record gets a flag called IsDelete (so that we are aware of deletes at the source).
Then, every 30 days, those flagged records are hard-deleted from this lakehouse in our Raw. All of this is managed by the Dataverse Direct Link.
In Bronze, we do incremental loads from these Raw tables using a column called SinkModifiedOn, so we always have the full history in Bronze, including the deletes.
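The Raw-to-Bronze incremental pattern described above can be sketched roughly like this in Spark SQL from a Fabric notebook. Table names are illustrative; `SinkModifiedOn` is the timestamp column that the Dataverse link adds to its sink tables. Because Bronze keeps full history (including IsDelete rows), this is an append rather than a merge:

```sql
-- Illustrative incremental append from the Dataverse-managed Raw lakehouse to Bronze.
-- Assumed names: raw_dataverse.account (source shortcut table), bronze.account (history table).
INSERT INTO bronze.account
SELECT *
FROM raw_dataverse.account
WHERE SinkModifiedOn > (
    -- Watermark: the latest modification timestamp already landed in Bronze.
    SELECT COALESCE(MAX(SinkModifiedOn), TIMESTAMP '1900-01-01')
    FROM bronze.account
);
```

Rows flagged IsDelete arrive with a fresh SinkModifiedOn, so they are picked up like any other change, which is why the deletes survive in Bronze even after the 30-day hard delete in Raw.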
The challenge is how to consume deletes in the same way from a SQL Server 2008 database.
2) New data source: on-premises SQL Server 2008
We are considering CDC, and we would like the same capability, i.e. knowing when something was deleted.
I was thinking of doing something like this:
In SQL Server:
1) Enable CDC at the database level and at the table level
2) Ensure SQL Server Agent is running
3) Configure CDC retention and monitor the capture/cleanup jobs, with a 7-day retention period
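Steps 1-3 above can be sketched in T-SQL roughly as follows (database, schema, and table names are placeholders; run with sysadmin/db_owner rights):

```sql
-- 1) Enable CDC on the database and on each table of interest.
USE MyDatabase;
EXEC sys.sp_cdc_enable_db;

EXEC sys.sp_cdc_enable_table
    @source_schema        = N'dbo',
    @source_name          = N'Customer',
    @role_name            = NULL,   -- no gating role for reading changes
    @supports_net_changes = 1;      -- requires a primary key on the table

-- 3) Set the cleanup job retention to 7 days (the value is in minutes: 7 * 24 * 60).
EXEC sys.sp_cdc_change_job
    @job_type  = N'cleanup',
    @retention = 10080;

-- Inspect the capture and cleanup jobs (both run under SQL Server Agent, step 2).
EXEC sys.sp_cdc_help_jobs;
```

Note that enabling CDC on a table creates the capture and cleanup Agent jobs automatically the first time; they only do work while SQL Server Agent is running.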
In Fabric:
1) Create a connection to SQL Server (via an on-premises data gateway installed on a VM that can reach the SQL Server)
2) Create a copy job and select CDC mode
3) Create a lakehouse in Raw
4) Point the copy job at it
5) Schedule the copy job to run once a day (we don't need more at this time)
6) Authentication: it seems that because the SQL Server is on-prem I cannot use a service principal; I have to use Windows authentication via the gateway, or SQL authentication
But I heard that this way the copy job will not be able to bring into Raw the information about records that were deleted.
This is challenging because we would like something like a flag we can use to know when something was deleted.
Can someone please help? Maybe I should not use the copy job to bring in the data? Should I instead use a copy activity from Fabric Data Factory?
Do you think the above list of requirements is correct, or am I missing something?
Thanks a lot for any note.
Best regards,
Pedro
Hi @fabricpribeiro ,
Thank you for reaching out to the Microsoft Community Forum.
Hi @fabricpribeiro , use Fabric Data pipelines --> Copy activity in CDC mode instead of a Copy job; CDC in Copy job (Preview) has some limitations. The Fabric Data Factory Copy activity supports CDC deletes: it reads delete, insert, update (before) and update (after) rows.
Note: it reads from the CDC change tables (cdc.<capture_instance>_CT), captures __$operation, and writes the operation metadata into the Lakehouse.
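For reference, this is roughly what reading a change table looks like in T-SQL; the capture instance name `dbo_Customer` is an assumption (it defaults to `<schema>_<table>` when CDC is enabled):

```sql
-- Read all changes captured so far for the dbo.Customer table.
DECLARE @from_lsn binary(10) = sys.fn_cdc_get_min_lsn('dbo_Customer');
DECLARE @to_lsn   binary(10) = sys.fn_cdc_get_max_lsn();

SELECT
    __$operation,   -- 1 = delete, 2 = insert, 3 = update (before image), 4 = update (after image)
    __$start_lsn,   -- LSN of the transaction, usable as an incremental watermark
    *
FROM cdc.fn_cdc_get_all_changes_dbo_Customer(@from_lsn, @to_lsn, N'all update old');
```

The `__$operation` value of 1 is what gives you the delete flag the Dataverse link provides out of the box.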
Please refer to the links below.
Change data capture (CDC) in Copy Job - Microsoft Fabric | Microsoft Learn
What is change data capture (CDC)? - SQL Server | Microsoft Learn
Change data capture - Azure Data Factory & Azure Synapse | Microsoft Learn
I hope this information helps. Please do let us know if you have any further queries.
Regards,
Dinesh
Hi @fabricpribeiro,
Fabric Mirroring doesn't support SQL Server 2008, so you will need to do some things yourself.
CDC is the way to go, but you will need somewhat complicated logic. First, you will need to pull the change tables, which tell you which rows were inserted, updated, or deleted.
From there, you will need logic in Fabric to delete any deleted records from your table, and then, for inserted or updated records, pull those from the main table, using the change table to filter which records to pull.
It's a little more complicated than if you could use Mirroring, but it is workable.
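Since the goal stated in the original post is a Dataverse-style delete flag with full history in Bronze, one way to apply the change rows once they land in Raw is an append with a derived IsDelete column, sketched here in Spark SQL (all table names are assumptions; `__$operation` is the CDC operation code):

```sql
-- Append CDC changes from Raw into a Bronze history table, mirroring the
-- Dataverse pattern: every version is kept and deletes become a flag.
INSERT INTO bronze.customer_history
SELECT
    c.*,
    CASE WHEN c.__$operation = 1 THEN 1 ELSE 0 END AS IsDelete
FROM raw.customer_changes AS c
WHERE c.__$operation IN (1, 2, 4);  -- deletes, inserts, and update-after images
```

If instead Bronze should hold only the current state, the same source feeds a MERGE: delete (or flag) on `__$operation = 1`, upsert on 2 and 4.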
Thanks for your reply; yep, we know we cannot use Mirroring.
We want to use CDC
The question was more:
Given the behavior we would like to have, shall we use a Copy job from Fabric, or shall we instead use a Fabric Data Factory copy activity to achieve what we wish?
Thanks a lot,
Pedro
Hi @fabricpribeiro ,
We haven't heard from you on the last response and were just checking back to see if you have a resolution yet. If you have any further queries, do let us know.
Regards,
Dinesh