I have a task where I need to fetch data from an OData entity with over 11 million rows and multiple columns into Power BI (either through Power BI Desktop or Fabric). However, I do not have SQL access, and I am unable to load the entire dataset due to performance or time limitations.
I’m looking for ways to efficiently load this data and implement incremental refresh to optimize performance. Since I only have the OData URL and no access to the underlying SQL, can you suggest any best practices or methods to handle large datasets like this in Power BI?
Thank you in advance for your help.
Hi @Anonymous ,
Since our database is very large, we plan to initially load all historical data up to the last 6 months. Going forward, we will enable incremental refresh to automatically load only new and updated data.
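To illustrate the plan above, here is a minimal Power Query M sketch of an incremental-refresh query over an OData feed. This is only an illustration with assumed names: the service URL and the `ModifiedDateTime` column are placeholders for your entity, and `RangeStart`/`RangeEnd` must first be created as DateTime parameters in Power BI Desktop before incremental refresh can be configured on the table.

```powerquery
// Sketch only — URL and column names are placeholders.
// RangeStart and RangeEnd are the DateTime parameters Power BI
// requires for incremental refresh.
let
    Source = OData.Feed(
        "https://your-service/odata/LargeEntity",  // placeholder URL
        null,
        [Implementation = "2.0"]
    ),
    // Filtering on RangeStart/RangeEnd lets Power BI load the data
    // partition by partition instead of all 11M+ rows at once; where
    // possible the filter folds into an OData $filter clause.
    Filtered = Table.SelectRows(
        Source,
        each [ModifiedDateTime] >= RangeStart and [ModifiedDateTime] < RangeEnd
    )
in
    Filtered
```

With this shape in place, the initial refresh loads the historical partitions once, and subsequent scheduled refreshes only fetch the most recent partition(s).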
Hi @Atul0215 ,
As we have not received a response from you yet, I would like to confirm whether you have successfully resolved the issue or whether you require further assistance. If it is solved, please accept the helpful reply as the solution.
Thank you for being a valued member of the Microsoft Fabric Community Forum!
Hi @Atul0215 ,
May I ask if you have gotten this issue resolved?
If it is solved, please mark the helpful reply (or share your own solution) and accept it as the solution. This will help other community members with similar problems resolve them faster.
Thank you for being a part of the Microsoft Fabric Community Forum!
Hi @Atul0215 ,
Thank you for updating!
If our response addressed your query, please mark that answer as "Accept as solution". It helps others in the community find it easily.
Thank you for being a part of the Microsoft Fabric Community Forum!
Regards,
Pallavi.
Hi @Atul0215 ,
I wanted to follow up on our previous suggestions regarding the Bring Your Own Database (BYOD) approach. We would like to hear back from you to ensure we can assist you further.
If our response has addressed your query, please accept it as a solution and give a ‘Kudos’ so other members can easily find it.
Thank you for being a valued member of the Microsoft Fabric Community Forum!
Regards,
Pallavi.
Hi @Atul0215 ,
Thank you for reaching out to the Microsoft Fabric Community Forum about the issue you are encountering.
Thank you for the helpful response, @Akash_Varuna. As Akash suggested, the provided answer is accurate and should help resolve the issue. Additionally, refer to the OData Feed documentation. If it works for you, please consider giving Kudos and marking it as the accepted solution to help others find it more easily.
If you are still facing difficulty resolving the issue, please feel free to reach out.
Regards,
Pallavi.
Hello @Anonymous
Thank you for your response. I have already tried the solutions mentioned by @Akash_Varuna , but I am still unable to refresh or load the data, even in Dataflows. I only need to fetch a single entity that contains a very large amount of data.
Regarding incremental refresh, my understanding is that it only retrieves new or changed data, whereas I need the complete historical dataset, which is quite large.
I am now exploring the BYOD (Bring Your Own Database) solution to handle this large dataset. Would this be a good approach in my case?
Any further guidance would be greatly appreciated.
Hi @Atul0215 , Bring Your Own Database is a good approach, but please take care regarding the cost and hosting of the database, and choose an option that fits your budget.
If this post helped, please give a Kudos and accept it as a solution.
Thanks In Advance
Hi @Atul0215 , you could use Dataflows and enable incremental refresh. Could you please try these out?
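For context, a Dataflow query against an OData entity is just a Power Query M query, so the same techniques apply there. Below is a hedged sketch (the URL and column names are assumptions, not from the original feed) showing two things that help with very large entities: the newer `Implementation = "2.0"` OData connector option and trimming to only the required columns, which can fold into an OData `$select` and shrink the payload.

```powerquery
// Sketch only — service URL and column names are placeholders.
let
    Source = OData.Feed(
        "https://your-service/odata/LargeEntity",  // placeholder URL
        null,
        [Implementation = "2.0"]  // newer connector implementation
    ),
    // Select only needed columns; where supported this folds into
    // an OData $select, so less data travels over the wire.
    Reduced = Table.SelectColumns(
        Source,
        {"Id", "ModifiedDateTime", "Amount"}  // placeholder columns
    )
in
    Reduced
```

Once the query is lean, incremental refresh can be configured on top of it in the Dataflow or dataset settings.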
Hello @Akash_Varuna ,
Thank you for your suggestions! I have tried the solutions you mentioned, but I am still unable to refresh or load the data, even in Dataflows. The entity I am working with contains a very large amount of data, and I need the complete historical dataset rather than just incremental changes.
To overcome the OData limitations, I am now exploring the BYOD (Bring Your Own Database) solution. Do you think this would be a better approach for handling large datasets in Power BI? Any insights or recommendations would be greatly appreciated!
Thanks again for your help.