Hi,
I have a really odd issue...
I have a Gen2 dataflow that copies a CSV file from an on-premises location into a Delta table. It's set to overwrite and in general appears to work fine. (I'd prefer to do this directly in Data Factory, but it can't read on-premises sources yet.)
If I query the table with the SQL endpoint, it returns the expected data, matching the source CSV:
If I read it into a Spark dataframe, though, I get old versions of the same records:
The problem is that I'm then using that dataframe to create what should be a unique dimension.
I've tried specifying the latest time travel version explicitly, and it still does the same.
The only thing I can think of is that there's some oddity in the Delta metadata being written by dataflows.
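For completeness, this is roughly what I tried in a Spark SQL cell in a Fabric notebook (the table name and version number are illustrative). `REFRESH TABLE` is also worth trying, in case Spark is holding cached metadata for the table:

```sql
-- Inspect the table's Delta commit history to find the latest version
DESCRIBE HISTORY my_table;

-- Read a specific version explicitly (Delta time travel)
SELECT * FROM my_table VERSION AS OF 5;

-- Clear any metadata Spark may have cached for the table
REFRESH TABLE my_table;
```

Even with the latest version from `DESCRIBE HISTORY` specified explicitly, the notebook still returns the stale records.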
Thanks @Joshrodgers123, at least that makes me feel like I'm not doing something silly!
I have the same issue, but reversed. I have a dataflow writing to a lakehouse. If I read the data in a notebook, I can see the correct and latest data. If I read the data through the SQL Endpoint (or even the table preview in the lakehouse), it shows an older version of the delta table.
I opened a support ticket (2312050040012594) and they are investigating. They said it was a sync issue.
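One quick way to confirm the sync lag (table name illustrative): compare what the notebook and the SQL endpoint each report for the same table, a sketch assuming a Delta table named `my_table`:

```sql
-- In a Fabric notebook (Spark SQL): note the latest commit version and timestamp
DESCRIBE HISTORY my_table LIMIT 1;

-- In the SQL endpoint: run a simple aggregate and compare it with the
-- same query in the notebook; a mismatch suggests the endpoint is lagging
SELECT COUNT(*) FROM my_table;
```

If the endpoint's count matches an older Delta version rather than the latest commit, that lines up with what support described as a sync issue.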