monikapatil2612
Regular Visitor

Incremental refresh

Hi friends, I am using a Python script in Power BI to connect to an Odoo server and get data. I used XML-RPC. It hits the API limit and fails to refresh, and the dataset is large. How do I set up incremental refresh on that dataset?

1 ACCEPTED SOLUTION

Hello @my_patil ,

 

If your company uses Microsoft products, you can use Azure SQL Database.

And for the Python script you can use Azure Functions. It runs on a schedule and it is serverless (no virtual machine to manage).

 

After writing to Azure SQL Database, you can apply incremental refresh policies.

In Azure SQL DB, create a table and add updated_at and write_date columns.

In the Python script, you can use the xmlrpc.client library.

 

Hope it gives you an idea.

 

View solution in original post

14 REPLIES
v-pnaroju-msft
Community Support

Hi monikapatil2612,

We are following up to see if what we shared solved your issue. If you need more support, please reach out to the Microsoft Fabric community.

Thank you.

my_patil
New Member

No, I am not experienced enough to write a Python script to load data into a database, but I can do it. Can you guide me on which database I should select, and what the best option is among the different clouds to store the Python script and the database?

 

 

 
v-pnaroju-msft
Community Support

Thank you, @anilelmastasi, @rodrigosan and @lbendlin for your responses.

Hi monikapatil2612,

We appreciate your inquiry through the Microsoft Fabric Community Forum.

We would like to inquire whether you have had the chance to check the solutions provided by @anilelmastasi, @rodrigosan and @lbendlin to resolve the issue. We hope the information provided helps to clear the query. Should you have any further queries, kindly feel free to contact the Microsoft Fabric community.

Thank you.

rodrigosan
Responsive Resident

Hi @monikapatil2612 
In my case, I have a SharePoint folder where users upload files every day, and the datasets have grown to more than 6 million rows. Performance became a huge problem, and refresh times were unacceptable. Because of corporate restrictions and cost limitations, I couldn’t use any other storage solutions like a database.

What I did was create two dataflows:

  • One dataflow for the historical data (which doesn’t change).

  • Another dataflow only for the incremental data uploaded daily.

This approach solved my performance issues and made the refresh process much faster.

What I did was create two dataflows:

What you did was to create a manual process for partition management.  You could have done the same in the semantic model directly, without the dataflows.

 

What you have now is an incremental refresh on top of an incremental refresh.  That can be optimized.

@lbendlin , thanks for the input! While I agree that standard Incremental Refresh is usually the way to go, the bottleneck here is the SharePoint Folder connector, which does not support Query Folding.

If I relied on the native Incremental Refresh in the Semantic Model, the mashup engine would still likely scan the metadata of all files to apply the date filters, causing the same I/O bottleneck.
By separating the Dataflows, I effectively force the engine to completely ignore the historical files during the daily refresh, which creates a 'hard partition' that SharePoint can't provide natively.

Ideally, I would use a SQL DB with folding, but given the constraints, this manual partitioning was the only way to bypass the scanning overhead.

anilelmastasi
Super User

Hello @monikapatil2612 ,

 

Incremental refresh requires a data source that supports query folding. However, Python data sources never fold, so Power BI cannot push date filters down to Odoo. It must re-download the entire dataset, and that is what causes the API limits to be exceeded.
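
For context, a Python data source in Power BI is just a script that returns a pandas DataFrame, so there is nothing for the mashup engine to fold the incremental refresh date filters (RangeStart/RangeEnd) into. A minimal sketch of what such a script typically looks like (host, credentials, model and field names are illustrative placeholders, not taken from this thread):

```python
# The kind of script Power BI runs as a Python data source.
# Host, credentials, model and field names below are illustrative only.
import xmlrpc.client

import pandas as pd

URL, DB, USER, PWD = "https://your-odoo-host", "your_db", "api_user", "secret"

common = xmlrpc.client.ServerProxy(f"{URL}/xmlrpc/2/common")
uid = common.authenticate(DB, USER, PWD, {})
models = xmlrpc.client.ServerProxy(f"{URL}/xmlrpc/2/object")

# The script decides what to fetch; Power BI cannot inject its incremental
# refresh date filters here, so every refresh re-reads the whole model.
records = models.execute_kw(
    DB, uid, PWD,
    "sale.order", "search_read",
    [[]],  # empty domain = all records
    {"fields": ["id", "name", "amount_total", "write_date"]},
)

# Power BI imports any pandas DataFrame defined by the script.
df = pd.DataFrame(records)
```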

 

My recommendation is to use a staging database. Can you create a Python script to load data into any database your company uses?

Yes, I can write a Python script to load data into a database. Can you guide me on which database I should select, and what the best option is among the different clouds to store the Python script and the database?

Hello @my_patil ,

 

If your company uses Microsoft products, you can use Azure SQL Database.

And for the Python script you can use Azure Functions. It runs on a schedule and it is serverless (no virtual machine to manage).
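
If it helps, a timer-triggered Azure Function using the Python v2 programming model could look roughly like the sketch below; the schedule, function name and the run_incremental_load helper are placeholders, not something defined in this thread:

```python
# function_app.py -- Azure Functions, Python v2 programming model
import logging

import azure.functions as func

app = func.FunctionApp()


# NCRONTAB schedule: run daily at 02:30 UTC (sec min hour day month day-of-week).
@app.timer_trigger(schedule="0 30 2 * * *", arg_name="timer", run_on_startup=False)
def load_odoo_to_sql(timer: func.TimerRequest) -> None:
    logging.info("Starting incremental Odoo -> Azure SQL load")
    # run_incremental_load() is a hypothetical helper wrapping the XML-RPC
    # extraction and the database upsert sketched further below.
    # run_incremental_load()
```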

 

After writing to Azure SQL Database, you can apply incremental refresh policies.

In Azure SQL DB, create a table and add updated_at and write_date columns.

In the Python script, you can use the xmlrpc.client library.
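
To make this concrete, here is a rough sketch of such a script. It assumes hypothetical connection details, the sale.order model, and a dbo.odoo_sale_order staging table keyed on id with a write_date column; adjust all names to your own environment:

```python
import os
import xmlrpc.client

import pyodbc  # requires the Microsoft ODBC Driver for SQL Server

# --- Placeholder Odoo connection details ---
ODOO_URL = "https://your-odoo-host"
ODOO_DB = "your_db"
ODOO_USER = "api_user"
ODOO_PASSWORD = os.environ["ODOO_PASSWORD"]

# --- Placeholder Azure SQL connection string ---
SQL_CONN_STR = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:your-server.database.windows.net,1433;"
    "Database=staging;Uid=loader;"
    "Pwd=" + os.environ["SQL_PASSWORD"] + ";Encrypt=yes;"
)

# Assumed staging table:
#   CREATE TABLE dbo.odoo_sale_order (
#       id INT PRIMARY KEY,
#       name NVARCHAR(100),
#       amount_total DECIMAL(18, 2),
#       write_date DATETIME2);


def fetch_changed_orders(since: str) -> list[dict]:
    """Pull only the records whose write_date is newer than the last load."""
    common = xmlrpc.client.ServerProxy(f"{ODOO_URL}/xmlrpc/2/common")
    uid = common.authenticate(ODOO_DB, ODOO_USER, ODOO_PASSWORD, {})
    models = xmlrpc.client.ServerProxy(f"{ODOO_URL}/xmlrpc/2/object")

    rows, offset, batch = [], 0, 500
    while True:
        chunk = models.execute_kw(
            ODOO_DB, uid, ODOO_PASSWORD,
            "sale.order", "search_read",
            [[["write_date", ">", since]]],
            {"fields": ["id", "name", "amount_total", "write_date"],
             "limit": batch, "offset": offset, "order": "write_date"},
        )
        if not chunk:
            break
        rows.extend(chunk)
        offset += batch  # small pages keep each call within the API limits
    return rows


def upsert_rows(rows: list[dict]) -> None:
    """MERGE the changed rows into the Azure SQL staging table."""
    with pyodbc.connect(SQL_CONN_STR) as conn:
        cur = conn.cursor()
        cur.executemany(
            """
            MERGE dbo.odoo_sale_order AS t
            USING (VALUES (?, ?, ?, ?)) AS s (id, name, amount_total, write_date)
            ON t.id = s.id
            WHEN MATCHED THEN UPDATE SET
                t.name = s.name,
                t.amount_total = s.amount_total,
                t.write_date = s.write_date
            WHEN NOT MATCHED THEN INSERT (id, name, amount_total, write_date)
                VALUES (s.id, s.name, s.amount_total, s.write_date);
            """,
            [(r["id"], r["name"], r["amount_total"], r["write_date"]) for r in rows],
        )
        conn.commit()


if __name__ == "__main__":
    # In practice the watermark would be read from the staging table,
    # e.g. SELECT MAX(write_date) FROM dbo.odoo_sale_order; hard-coded here.
    changed = fetch_changed_orders("2025-01-01 00:00:00")
    if changed:
        upsert_rows(changed)
```

Because only rows changed since the last watermark are requested, each run stays within the Odoo API limits, and the staging table then gives Power BI a foldable SQL source for its incremental refresh policy.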

 

Hope it gives you an idea.

 

Thank you very much for your help! Your guidance is really useful to me.
I have proposed this to my manager as you suggested. I am waiting for his response.

You are welcome 🙂 

Incremental refresh requires a data source that supports query folding

Not entirely accurate.  It prefers that the query folds, but it does not strictly require it.  Non-folding sources will not provide any benefit to the source and network, but this will not impact the partition management.

If I remember correctly, I read something like this 1.5-2 years ago. When I check the documentation now, yes, non-folding queries are acceptable but not recommended.

[Screenshot: query-folding-warning.png]

 

However, query folding for life 🙂 

