Ultra Epic Idea: Allow Direct Delta Log & Parquet Access for Mirrored Databases

We (and I'm sure many others) need to be able to incrementally transform data from Fabric mirrored databases into higher tiers of a medallion architecture. We do not want to make alterations at the source to support a watermarking approach (that requires more work). If we could simply read the delta log and files of the mirrored database (you can see them exposed via a shortcut from a lakehouse, but you can't actually interact with them from a notebook), we could efficiently and incrementally transform data into higher tiers. Please expose the delta logs and parquet files of Fabric mirrored databases! Whack-and-load approaches are not sustainable!

 

Thanks! 

Status: New
Comments
_dbar_
Frequent Visitor
Brad_Dean
Regular Visitor
This would be a huge benefit to our processes
Markpm_msft
Microsoft Employee

This already exists; you do have direct access to the underlying delta tables.
Some people create a separate lakehouse and create shortcuts to the mirrored database, but you don't need to.

The only downside is that you need to access the tables via the ABFS path. You can look at the files (delta logs, parquet files, deletion vectors, etc.) using Azure Storage Explorer or the OneLake file explorer.
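
For illustration, a minimal sketch of doing the same from a Fabric notebook (every path segment is a placeholder, and mssparkutils is the file utility built into Fabric notebooks):

# All path segments are placeholders; substitute your own workspace,
# mirrored database item, and table names.
abfs_path = "abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<mirrored_db>/Tables/<table>"

# List the Delta transaction log (JSON commits, parquet checkpoints) directly.
for f in mssparkutils.fs.ls(f"{abfs_path}/_delta_log"):
    print(f.name, f.size)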

 

 

dbWizard
Advocate I

@Markpm_msft - can you please point me to an example of anyone leveraging the abfss path to the delta log of a mirrored Fabric database to incrementally transform only what is new, changed, or removed into a silver layer with a notebook? Have you done or seen this personally? Also, please see this post: https://community.fabric.microsoft.com/t5/Data-Engineering/Is-it-possible-yet-to-leverage-delta-log-...

Markpm_msft
Microsoft Employee

I put together a demo using time travel in Delta to get the differences between two versions. (I would have to dig it out.)

Delta Lake Time Travel | Delta Lake
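
The gist of that approach, as a rough sketch (the path and version numbers are placeholders, not taken from the actual demo):

# Placeholder path; point this at your own mirrored table.
abfs_path = "abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<mirrored_db>/Tables/<table>"

# Read the same table at two committed versions via Delta time travel.
old_df = spark.read.format("delta").option("versionAsOf", 10).load(abfs_path)
new_df = spark.read.format("delta").option("versionAsOf", 12).load(abfs_path)

# Rows in the newer version but not the older one (inserts plus post-update rows).
added = new_df.exceptAll(old_df)
# Rows in the older version but not the newer one (deletes plus pre-update rows).
removed = old_df.exceptAll(new_df)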

 

What you really want is the Delta change data feed, which is on the roadmap, planned for Q3.

Markpm_msft_2-1749563344497.png
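
For reference, a standard Delta Lake change data feed read looks like the sketch below; since CDF for mirrored databases is still only planned, this is illustrative and may not work against a mirrored table today:

# Standard Delta Lake CDF syntax; illustrative until the roadmap item ships.
# Re-uses abfs_path from the time-travel sketch above.
changes = (
    spark.read.format("delta")
    .option("readChangeFeed", "true")
    .option("startingVersion", 10)  # or startingTimestamp
    .load(abfs_path)
)
# Rows carry _change_type, _commit_version, and _commit_timestamp,
# which is exactly what an incremental silver-layer merge needs.
display(changes)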

 

But to answer your original question, below is how to find the path and use it in Spark.

 

If you go to the SQL analytics endpoint, find the table, and look at its properties, you can get the ABFS path.

Markpm_msft_0-1749561910572.png
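
The path generally follows the OneLake pattern below (every segment is a placeholder; the exact item segment varies by item type, so copying the path from the properties pane is the safest route):

# General shape of a OneLake ABFS path; all segments are placeholders.
abfs_path = (
    "abfss://<workspace>@onelake.dfs.fabric.microsoft.com"
    "/<mirrored_database_item>/Tables/<table>"
)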

Create a notebook and just paste the ABFS path in. 

# Read the mirrored table as a Delta table straight from its ABFS path.
df = spark.read.format("delta").load("<paste in ABFS path>")
display(df)
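
If the load works, a quick sanity check is to inspect the commit history at the same path (a sketch, using the placeholder abfs_path from above):

# Confirm the transaction log is readable end to end.
history = spark.sql(f"DESCRIBE HISTORY delta.`{abfs_path}`")
display(history.select("version", "timestamp", "operation"))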

Markpm_msft_1-1749562469873.png