Atul0215
Frequent Visitor

How to Fetch 11M+ Rows from OData Entity in Power BI with Incremental Refresh?

I have a task where I need to fetch data from an OData entity with over 11 million rows and multiple columns into Power BI (either through Power BI Desktop or Fabric). However, I do not have SQL access, and I am unable to load the entire dataset due to performance or time limitations.

I’m looking for ways to efficiently load this data and implement incremental refresh to optimize performance. Since I only have the OData URL and no access to the underlying SQL, can you suggest any best practices or methods to handle large datasets like this in Power BI?

Thank you in advance for your help.

1 ACCEPTED SOLUTION

Hi @Anonymous ,

Since our database is very large, we plan to initially load all historical data up to the last 6 months. Going forward, we will enable incremental refresh to automatically load only new and updated data.

10 REPLIES
Anonymous
Not applicable

Hi @Atul0215 ,
As we have not received a response from you yet, I would like to confirm whether you have successfully resolved the issue or whether you require further assistance. If it is solved, please accept the helpful reply as the solution.

Thank you for being a valued member of the Microsoft Fabric Community Forum!

Anonymous
Not applicable

Hi @Atul0215 ,

May I ask if you have gotten this issue resolved?
If it is solved, please mark the helpful reply, or share your own solution and accept it, so that other members of the community who have similar problems can resolve them faster.
Thank you for being a part of the Microsoft Fabric Community Forum!

Hi @Anonymous ,

Since our database is very large, we plan to initially load all historical data up to the last 6 months. Going forward, we will enable incremental refresh to automatically load only new and updated data.
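A minimal Power Query M sketch of this pattern, assuming a hypothetical service URL, an entity named SalesOrderLines, and a ModifiedDateTime column (adjust all three to your feed). RangeStart and RangeEnd must already be defined as DateTime parameters in Power Query before the incremental refresh policy is configured:

let
    // Connect to the OData service (URL is a placeholder).
    Source = OData.Feed("https://example.com/data/odata", null, [Implementation = "2.0"]),
    // Navigate to the large entity (name is a placeholder).
    Entity = Source{[Name = "SalesOrderLines", Signature = "table"]}[Data],
    // Filter on a datetime column with the RangeStart/RangeEnd parameters so the
    // step can fold into an OData $filter; incremental refresh then loads the data
    // one partition at a time instead of pulling all 11M+ rows in a single query.
    Filtered = Table.SelectRows(
        Entity,
        each [ModifiedDateTime] >= RangeStart and [ModifiedDateTime] < RangeEnd
    )
in
    Filtered

Use >= on one boundary and < on the other (as above) so rows on a partition edge are not loaded twice; the policy itself (for example, archive 6 months of history and refresh only the most recent days) is then set on the table in Power BI Desktop or on the Dataflow entity.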

Anonymous
Not applicable

Hi @Atul0215 ,
Thank you for updating!
If our response addressed your query, please mark that answer as the accepted solution. It helps others in the community find it easily.

Thank you for being a part of the Microsoft Fabric Community Forum!

Regards,
Pallavi.

Anonymous
Not applicable

Hi @Atul0215 ,
I wanted to follow up on our previous suggestions regarding the Bring Your Own Database (BYOD) approach. We would like to hear back from you to ensure we can assist you further.
If our response has addressed your query, please accept it as a solution and give a ‘Kudos’ so other members can easily find it. 
Thank you for being a valued member of the Microsoft Fabric Community Forum!

Regards,
Pallavi.

Anonymous
Not applicable

Hi @Atul0215 ,

Thank you for reaching out to the Microsoft Fabric Community Forum about the issue you are encountering.

Thank you for the helpful response, @Akash_Varuna. As Akash suggested, the provided answer is accurate and should help resolve the issue. Additionally, refer to the OData Feed documentation. If it works for you, please consider giving Kudos and marking it as the accepted solution to help others find it more easily.
If you are still facing difficulty resolving the issue, please feel free to reach out.

Regards,
Pallavi.

Hello @Anonymous 

Thank you for your response. I have already tried the solutions mentioned by @Akash_Varuna , but I am still unable to refresh or load the data, even in Dataflows. I only need to fetch a single entity that contains a very large amount of data.

Regarding incremental refresh, my understanding is that it only retrieves new or changed data, whereas I need the complete historical dataset, which is quite large.

I am now exploring the BYOD (Bring Your Own Database) solution to handle this large dataset. Would this be a good approach in my case?

Any further guidance would be greatly appreciated.

Hi @Atul0215 , Bring Your Own Database (BYOD) is a good approach, but please do keep an eye on the cost and hosting of the database and make sure it fits your budget.
If this post helped, please give a Kudos and accept this as a solution.
Thanks in advance!

Akash_Varuna
Super User

Hi @Atul0215 , you could use Dataflows and enable incremental refresh. Could you please try the following:

  • Enable Incremental Refresh: Use RangeStart and RangeEnd parameters in Power Query to filter data by date, then configure incremental refresh in Power BI Service.
  • Ensure Query Folding: Verify that filtering steps fold back to the OData source to optimize performance.
  • Optimize Query: Limit columns using Table.SelectColumns and apply pagination with $top and $skip if the service supports it (see the sketch after this reply).
  • Use Dataflows: Set up incremental refresh in a Power BI Dataflow for reusability and efficiency.
If this post helped, please give a Kudos and accept this as a solution.
Thanks in advance!
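A minimal Power Query M sketch of the column-trimming and $top/$skip paging ideas in the list above, assuming the service honours those query options; the URL, entity name, page size, and column names are all placeholders, and a URL built dynamically like this may be flagged as a dynamic data source when refreshed in the Power BI Service:

let
    // Placeholder entity URL and page size; adjust to your OData feed.
    BaseUrl  = "https://example.com/data/odata/SalesOrderLines",
    PageSize = 5000,

    // Fetch one page with $top/$skip (assumes the service supports these options).
    GetPage = (skip as number) as table =>
        OData.Feed(BaseUrl & "?$top=" & Number.ToText(PageSize)
                           & "&$skip=" & Number.ToText(skip)),

    // Keep requesting pages until an empty page comes back.
    Pages = List.Generate(
        () => [Skip = 0, Page = GetPage(0)],
        each Table.RowCount([Page]) > 0,
        each [Skip = [Skip] + PageSize, Page = GetPage([Skip] + PageSize)],
        each [Page]
    ),
    Combined = Table.Combine(Pages),

    // Keep only the columns that are actually needed (placeholder names).
    Selected = Table.SelectColumns(Combined, {"OrderId", "OrderDate", "Amount"})
in
    Selected

Filters and column selections applied directly against OData.Feed usually fold to the source anyway, so manual paging like this is mainly worth trying when a straightforward load keeps timing out.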

Hello @Akash_Varuna ,

Thank you for your suggestions! I have tried the solutions you mentioned, but I am still unable to refresh or load the data, even in Dataflows. The entity I am working with contains a very large amount of data, and I need the complete historical dataset rather than just incremental changes.

To overcome the OData limitations, I am now exploring the BYOD (Bring Your Own Database) solution. Do you think this would be a better approach for handling large datasets in Power BI? Any insights or recommendations would be greatly appreciated!

Thanks again for your help.
