joshua1990
Post Prodigy

Ingesting CSV files and appending ODBC data

I am working on building a table in our Lakehouse that consolidates data from both CSV files and an ODBC connection. However, I encountered some challenges when uploading CSV files directly through the Lakehouse GUI; specifically, the column types are not defined, which is causing issues.

To address this, I have created a dataflow that consolidates the CSV files into a single table. Now, I would like to extend this table by appending data retrieved via the ODBC connection. The data sourced from the ODBC connection mirrors the data contained in the CSV files, which we save quarterly before deleting it from its original source.

 

Is there a way to verify whether the information from the CSV files is already present before appending the ODBC data to the Lakehouse table? Essentially, I would like to perform a check for existing records prior to the appending process.

Your insights on this process would be greatly appreciated! Maybe a dataflow is not the best approach here.

4 REPLIES
v-lgarikapat
Community Support

Hi @joshua1990,

Thanks for reaching out to the Microsoft Fabric community forum.

@lbendlin, thanks for your prompt response.

@joshua1990,

 

Step-by-Step Approach to Appending ODBC Data to the Lakehouse After Checking CSV History

1. Standardize Schema Early

You’ve likely done this in your dataflow already, but just to be safe: define explicit column types and ensure both the CSV and ODBC sources match. You can do this in Power Query inside your Dataflow by setting each column's data type manually; this avoids schema drift or mismatches downstream.
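
For example, you could also create the target table yourself with a pinned schema in Spark SQL from a Fabric notebook, instead of relying on the GUI upload's type inference. A minimal sketch; final_consolidated_table and primary_key come from the SQL below, while the other column names and types are placeholders for your actual schema:

-- Explicit column types: both the CSV dataflow output and the ODBC
-- data must conform to this one schema, preventing type drift.
CREATE TABLE IF NOT EXISTS final_consolidated_table (
    primary_key STRING,
    quarter     STRING,
    amount      DECIMAL(18, 2),
    load_date   DATE
)
USING DELTA;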

2. Stage ODBC Data in a Temporary Table

Before appending to your main table, load the ODBC data into a staging table in the Lakehouse (e.g., stg_odbc_quarterly). You can ingest it using Dataflow Gen2, a pipeline, or even a notebook-based Fabric data task, for example:
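
A minimal Spark SQL sketch of that staging step, run from a Fabric notebook; the columns are placeholders matching the example in step 1, and the actual load of the ODBC extract into this table would be done by your Dataflow Gen2 or pipeline copy activity:

-- Staging table with the same explicit schema as the target table.
CREATE TABLE IF NOT EXISTS stg_odbc_quarterly (
    primary_key STRING,
    quarter     STRING,
    amount      DECIMAL(18, 2),
    load_date   DATE
)
USING DELTA;

-- Reset the staging area before each quarterly load (Delta supports DELETE).
DELETE FROM stg_odbc_quarterly;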

3. Deduplicate with a SQL Statement

Once you have both datasets available in the Lakehouse, use a MERGE or INSERT ... SELECT SQL statement with a NOT EXISTS or LEFT JOIN condition to append only new records:

INSERT INTO final_consolidated_table
SELECT *
FROM stg_odbc_quarterly AS o
WHERE NOT EXISTS (
    SELECT 1
    FROM final_consolidated_table AS f
    WHERE f.primary_key = o.primary_key
);

Make sure the primary_key (or composite key) you're comparing is consistent between both CSV and ODBC sources.
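
For the MERGE variant mentioned in step 3, here is a Delta Lake sketch of the same existence check; the commented line shows where a composite key would extend the ON clause:

MERGE INTO final_consolidated_table AS f
USING stg_odbc_quarterly AS o
    ON f.primary_key = o.primary_key
    -- AND f.quarter = o.quarter  (add columns like this for a composite key)
WHEN NOT MATCHED THEN
    INSERT *;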

 

If this post helped resolve your issue, please consider marking it as the Accepted Solution. This not only acknowledges the support provided but also helps other community members find relevant solutions more easily.

We appreciate your engagement and thank you for being an active part of the community.

Best regards,
LakshmiNarayana

lbendlin
Super User

Two things to keep in mind:

 

1. CSV data sources by definition do not fold.

2. The smallest granularity you can achieve with non-folding data sources is the partition level. The smallest partition you can create via normal means is a daily partition.

 

Ideally you would load all that data into a SQL database (in Fabric or elsewhere) and then do the deduplication there.
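
To illustrate that option: once everything lands in a SQL database, a common way to purge duplicates after the fact is ROW_NUMBER over the business key. A generic T-SQL sketch using the placeholder names from earlier in this thread, assuming a load_date column identifies the newest copy:

WITH ranked AS (
    SELECT
        primary_key,
        ROW_NUMBER() OVER (
            PARTITION BY primary_key
            ORDER BY load_date DESC
        ) AS rn
    FROM final_consolidated_table
)
-- Keep the newest row per key, delete every older duplicate.
DELETE FROM ranked
WHERE rn > 1;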

@lbendlin: Why not just use a dataflow instead of creating a full SQL database for the deduplication?

A dataflow is just a bunch of CSV files in Azure Blob storage. Duplication of storage for no functional gain.
