Hi,
I have a really odd issue...
I have a Gen2 dataflow that copies a CSV file from an on-prem location into a Delta table. It's set to overwrite and in general appears to work fine. (I'd prefer to do that directly in Data Factory, but it can't read on-prem sources yet.)
If I query it with the SQL endpoint, it returns the expected data, matching the source CSV.
If I read it into a Spark dataframe, though, I get old versions of the same records.
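The notebook read is nothing exotic, essentially just this (the table name is a placeholder for my actual one, and `spark` is the session a Fabric notebook provides):

```python
# Plain read of the lakehouse table written by the dataflow.
# "my_lakehouse.my_table" is a placeholder, not the real name.
df = spark.table("my_lakehouse.my_table")
df.show()
```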
The problem is I'm then using that dataframe to create what should be a unique dimension.
I've tried specifying the latest time-travel version explicitly, and it still does the same.
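Roughly what I tried, in case anyone wants to reproduce it (again the table name is a placeholder, and I'm assuming the Fabric runtime's Delta SQL time-travel syntax here):

```python
# Find the newest version recorded in the Delta transaction log.
# "my_lakehouse.my_table" is a placeholder for the real table name.
history = spark.sql("DESCRIBE HISTORY my_lakehouse.my_table")
latest_version = history.agg({"version": "max"}).first()[0]

# Read that exact version explicitly - for me this still
# returns the stale rows, same as the plain read.
df = spark.sql(
    f"SELECT * FROM my_lakehouse.my_table VERSION AS OF {latest_version}"
)
df.show()
```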
The only thing I can think of is that there's some oddity in the Delta metadata being written by Dataflows.
Thanks @Joshrodgers123, at least that makes me feel like I'm not doing something silly!
I have the same issue, but reversed. I have a dataflow writing to a lakehouse. If I read the data in a notebook, I can see the correct and latest data. If I read the data through the SQL Endpoint (or even the table preview in the lakehouse), it shows an older version of the delta table.
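For anyone comparing the two, this is roughly how I check which version the notebook side is serving (table name is a placeholder):

```python
# Show the most recent commits in the Delta log - the notebook read
# lines up with the top entry, while the SQL Endpoint lags behind it.
# "my_lakehouse.my_table" is a placeholder for the real table name.
(spark.sql("DESCRIBE HISTORY my_lakehouse.my_table")
      .select("version", "timestamp", "operation")
      .orderBy("version", ascending=False)
      .show(5, truncate=False))
```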
I opened a support ticket (2312050040012594) and they are investigating. They said it was a sync issue.