jaryszek
Memorable Member

How to do incremental refresh using OneLake data lake tables?

Hello,

As in the topic title: how do I do incremental refresh using OneLake data lake tables (Fabric)?

I have tables coming from Azure Blob Storage into OneLake in Fabric.
How can I set up incremental refresh? I have Delta Parquet tables with year, month and day in the name...

Best,
Jacek

1 ACCEPTED SOLUTION
v-agajavelly
Community Support

Hi @jaryszek ,

Sorry for the delay in response. Good point: you're right that in Direct Lake mode Power Query isn't available, so you can't set up incremental refresh the usual way. In that case the trick is to manage incrementality at the data source / lakehouse level.

  • Since your tables are Delta Parquet partitioned by year/month/day, you can control what gets landed into the Lakehouse table, for example via pipelines or notebooks (see the sketch after this list).
  • Direct Lake will then pick up those new partitions automatically without a full reload.
  • If you need a true incremental refresh policy (RangeStart/RangeEnd filtering), that's only supported in Import / DirectQuery mode today, not in Direct Lake.
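
A minimal notebook sketch of that landing step, assuming a hypothetical Blob Storage path and a sales Delta table partitioned by year/month/day (all paths, table and column names here are illustrative, not from this thread):

    from pyspark.sql import SparkSession

    # In a Fabric notebook a `spark` session already exists;
    # getOrCreate() simply returns it there.
    spark = SparkSession.builder.getOrCreate()

    # Hypothetical source path for one newly landed day of files.
    source_path = (
        "abfss://raw@mystorage.dfs.core.windows.net/sales/year=2025/month=10/day=15/"
    )

    # Read only the new day of data from Blob Storage...
    new_rows = spark.read.parquet(source_path)

    # ...and append it to the lakehouse Delta table. A Direct Lake model on
    # this table picks up the new rows on the next reframe, with no full reload.
    new_rows.write.format("delta").mode("append").save("Tables/sales")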

So you can try the ways below.
With Direct Lake: keep your data partitioned properly and let Fabric read the latest partitions.
With Import: use Power Query plus an incremental refresh policy.
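
And if the same day can be re-delivered from the source, a blind append would duplicate rows; a partition overwrite with Delta's replaceWhere option keeps the load idempotent (continuing the sketch above, with the same assumed table and partition column names):

    # Overwrite only the 2025-10-15 partition instead of appending, so
    # reprocessing a re-delivered day does not create duplicates. Assumes
    # year/month/day are partition columns of the assumed sales table.
    (new_rows.write.format("delta")
        .mode("overwrite")
        .option("replaceWhere", "year = 2025 AND month = 10 AND day = 15")
        .save("Tables/sales"))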

Thanks for calling this out; it's an important distinction between Direct Lake and Import.

Thanks,
Akhil.


3 REPLIES

v-agajavelly
Community Support

Hi @jaryszek ,

Thanks for raising this. Since your data is already in Delta Parquet with year/month/day folders, you can leverage Fabric's incremental refresh at the semantic model level. Create RangeStart / RangeEnd parameters in Power Query, filter on your date column, and then configure incremental refresh in the dataset settings (a sketch of the filter step follows below). Fabric will push the filters down to your OneLake Delta table so that only new partitions are scanned. This way you avoid reloading the full history every time and only process the latest data. Thanks to our super users for sharing these best practices earlier; they really make it easier to set up.
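
A minimal Power Query sketch of that filter step. RangeStart and RangeEnd are the standard incremental refresh parameters and must be defined as datetime parameters; the navigation step and the OrderDate column are assumptions for illustration:

    let
        // Lakehouse.Contents is the Fabric Lakehouse connector; navigating
        // down to a specific table depends on your workspace, so that part
        // is only sketched here.
        Source = Lakehouse.Contents(null),
        // MyTable stands for the table reached after navigation (assumed name).
        MyTable = Source,
        // The incremental refresh policy fills RangeStart / RangeEnd in at
        // refresh time; filtering on them lets the engine prune partitions.
        Filtered = Table.SelectRows(
            MyTable,
            each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd
        )
    in
        Filtered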

Regards,
Akhil.

OK, but the issue is that I cannot transform any data (Power Query is not working) when I am connecting to OneLake:

[Screenshot: jaryszek_0-1756793133084.png]


Power Query is not available; it is not a SQL endpoint connection there but Direct Lake...

What now? 🙂 

Best,
Jacek
