Hi,
I have a really odd issue...
I have a Gen2 dataflow that copies a CSV file from an on-prem location into a Delta table. It's set to overwrite and in general appears to work fine. (I'd prefer to do that directly in Data Factory, but it can't read on-prem sources yet.)
If I query it with the SQL endpoint, it returns the expected data that matches the source CSV:
If I read it into a Spark dataframe, though, I get old versions of the same records:
The problem is I'm then using that dataframe to create what should be a unique dimension.
I've tried specifying the latest time travel version explicitly, and it still does the same.
The only thing I can think of is that there's some oddity in the Delta metadata being written by dataflows.
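In case it's useful, this is roughly how I've been checking things from the notebook side (the table path and name below are placeholders, not my actual table): look at the Delta history, refresh any cached metadata, and read the newest version explicitly to rule out a stale snapshot on the Spark side.

```python
from delta.tables import DeltaTable

# Placeholder path/name for the lakehouse table the dataflow writes to
table_path = "Tables/my_dim_source"
table_name = "my_dim_source"

# Inspect the Delta transaction log: the top row is the latest commit,
# which should be the dataflow's most recent overwrite
dt = DeltaTable.forPath(spark, table_path)
dt.history().select("version", "timestamp", "operation").show(truncate=False)

# Clear any cached metadata Spark may be holding for this table,
# then read it again and compare against the SQL endpoint result
spark.catalog.refreshTable(table_name)
spark.read.format("delta").load(table_path).show()

# Explicit time travel to the newest version, to confirm the latest
# commit really contains the overwritten data
latest_version = dt.history(1).collect()[0]["version"]
(
    spark.read.format("delta")
    .option("versionAsOf", latest_version)
    .load(table_path)
    .show()
)
```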
Thanks @Joshrodgers123, at least that makes me feel like I'm not doing something silly!
I have the same issue, but reversed. I have a dataflow writing to a lakehouse. If I read the data in a notebook, I can see the correct and latest data. If I read the data through the SQL Endpoint (or even the table preview in the lakehouse), it shows an older version of the delta table.
I opened a support ticket (2312050040012594) and they are investigating. They said it was a sync issue.
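For anyone comparing the two sides, a quick way to see which Delta version the notebook is actually reading (so you can tell how far behind the SQL endpoint sync is) is something like the sketch below; the table path is a placeholder.

```python
# Placeholder path for the lakehouse table the dataflow writes to
table_path = "Tables/my_table"

# Show the latest Delta commits from the notebook side so they can be
# compared with what the SQL endpoint / lakehouse table preview returns
history = spark.sql(f"DESCRIBE HISTORY delta.`{table_path}`")
history.select("version", "timestamp", "operation").show(truncate=False)

# A notebook-side row count to compare with SELECT COUNT(*) run against
# the SQL endpoint; a mismatch points at the endpoint sync lagging behind
print(spark.read.format("delta").load(table_path).count())
```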