
ge83rep
New Member

How to fully load data once and then incrementally update with yesterday’s data in Dataflow Gen2?

Hi all,

I'm new to Dataflow Gen2 and exploring it to centralize data transformations for Power BI reporting. I’m using F2 capacity and pulling data from multiple sources like SQL Server, Salesforce, and Microsoft Business Central.

While building the dataflows, I didn’t define any explicit output data destination (bottom left panel in the dataflow UI).

ge83rep_3-1753367245908.png
But I noticed that files like “StagingLakehouseForDataflows_...” are automatically getting created in my workspace under type “Lakehouse” or “Warehouse”.

ge83rep_1-1753367110713.png
  • What exactly are these files?

  • Is it okay to leave the output destination undefined if I can still see the tables and use them in Power BI?

Incremental refresh questions:

I tried setting up incremental refresh on one of the queries (Currency Exchange Rate). This table has three columns:

  • Currency Code

  • Starting Date

  • Currency Exchange Rate

There’s historical data available from 2020 onwards. Since the exchange rates update daily, my goal is to:

  1. Load the full historical dataset initially

  2. Then, refresh only yesterday’s data daily (i.e., append new records to the existing dataset)

So I used these settings in Incremental Refresh:

  • Date column: Starting Date

  • Extract data from the past: 1 day

  • Bucket size: Day

    ge83rep_4-1753367412076.png
But after enabling and running the refresh, I noticed that only records with yesterday’s date were loaded — the older data (from 2020 onwards) wasn't loaded at all.
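This behavior follows from how the refresh window is applied: on every run, only rows whose date column falls inside the configured lookback window are requested from the source, so a 1-day window never asks for the 2020 history. The following is a minimal Python sketch of that window logic (an illustration only, not Dataflow Gen2 internals; the row shape and field names are made up for the example):

```python
from datetime import date, timedelta

def incremental_refresh(source_rows, lookback_days, today):
    """Simulate the window filter: only rows whose date column falls
    inside the lookback window are extracted and loaded."""
    window_start = today - timedelta(days=lookback_days)
    return [r for r in source_rows
            if window_start <= r["starting_date"] < today]

rows = [
    {"starting_date": date(2020, 1, 1),  "rate": 1.10},  # historical row
    {"starting_date": date(2025, 7, 23), "rate": 1.17},
    {"starting_date": date(2025, 7, 24), "rate": 1.18},  # "yesterday"
]

# With "Extract data from the past: 1 day", only yesterday's bucket is in
# scope; the 2020 history is never requested from the source at all.
loaded = incremental_refresh(rows, lookback_days=1, today=date(2025, 7, 25))
print(loaded)  # only the 2025-07-24 row
```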

What I need:

  • I want the dataflow to load the entire dataset initially, and then on each scheduled refresh, only add new records from yesterday (without removing the older ones).

  • How can I properly configure incremental refresh to support this pattern in Dataflow Gen2?
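A pattern commonly suggested for this (verify against the official documentation for your setup) is a one-off full load followed by a narrow daily window: run the dataflow once with incremental refresh disabled or with the window widened to cover the full history, then set the window to 1 day; the destination keeps partitions outside the window untouched, so daily runs only replace yesterday's bucket. A hedged Python sketch of the intended destination behavior (the table, key columns, and `upsert` helper are all hypothetical stand-ins, not Fabric APIs):

```python
from datetime import date, timedelta

# Stand-in for the destination table, keyed by (currency_code, starting_date).
destination = {}

def upsert(rows):
    """Append or replace rows for their own dates; other dates are untouched."""
    for r in rows:
        destination[(r["currency_code"], r["starting_date"])] = r["rate"]

# Step 1: one-off full historical load (incremental refresh off, or the
# window temporarily widened to reach back to 2020).
history = [{"currency_code": "EUR", "starting_date": date(2020, 1, 1), "rate": 1.12}]
upsert(history)

# Step 2: each scheduled run loads only yesterday's bucket and appends it.
yesterday = date.today() - timedelta(days=1)
daily = [{"currency_code": "EUR", "starting_date": yesterday, "rate": 1.18}]
upsert(daily)

print(len(destination))  # history preserved, yesterday appended
```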


Thanks in advance for any help or clarification!

3 REPLIES
lbendlin
Super User

Please follow the documentation. Your scenario may not be supported.

 

Incremental refresh in Dataflow Gen2 - Microsoft Fabric | Microsoft Learn

Hi @ge83rep 

Just following up on your query regarding Dataflow Gen2 output destinations and setting up incremental refresh for your Currency Exchange Rate table.

As noted earlier, @lbendlin  has shared the official documentation on configuring incremental refresh in Dataflow Gen2, which outlines the steps needed to ensure an initial full load followed by daily incremental loads.

Whenever you get a chance, could you confirm whether the guidance helped resolve the issue?

We’re happy to assist further if needed.

Looking forward to your response!

Hi @ge83rep 

Just a quick reminder that @lbendlin shared the official documentation that covers how Dataflow Gen2 handles output destinations and how to configure incremental refresh correctly to support full historical load followed by daily appends.

 

  • The auto-created “StagingLakehouseForDataflows...” items are expected behavior when no explicit destination is defined: Dataflow Gen2 stages data in a temporary Lakehouse.

  • For incremental refresh, the documentation explains how to ensure an initial full load and how to partition correctly for daily updates without losing historical data.

Whenever you get a chance, please review the provided link and let us know if anything needs further clarification or adjustment for your setup.

Thank You.
