Hello,
We are using a custom application, and the only way to extract data from it is through an ODBC driver that has a 1-million-row limit per request. Currently I create a dataflow with one query per 1-million-row block and append the blocks together. The problem is that every refresh reloads all of the blocks instead of just updating with the latest data.
Another problem is scale: if I have 10 million rows, I need to create 10 separate 1-million-row blocks by hand.
I would love your suggestions on how best to implement this in Power BI.
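The append approach above can be automated instead of hand-writing one query per block. A minimal sketch, using Python with sqlite3 as a hypothetical stand-in for the custom application's ODBC source (table and column names are made up), shows keyset pagination: fetch fixed-size blocks and append them until the source is exhausted, so 10 million rows need no manual 10-block setup.

```python
import sqlite3

# Hypothetical stand-in for the ODBC source: an in-memory table
# with a monotonically increasing "id" column.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, value TEXT)")
conn.executemany(
    "INSERT INTO events (id, value) VALUES (?, ?)",
    [(i, f"row-{i}") for i in range(1, 26)],
)

BATCH = 10  # stands in for the driver's 1,000,000-row cap


def fetch_all(conn, batch=BATCH):
    """Pull the whole table in fixed-size blocks (keyset pagination),
    appending each block instead of writing one query per block."""
    rows, last_id = [], 0
    while True:
        block = conn.execute(
            "SELECT id, value FROM events WHERE id > ? ORDER BY id LIMIT ?",
            (last_id, batch),
        ).fetchall()
        if not block:
            break
        rows.extend(block)
        last_id = block[-1][0]  # resume after the last id already seen
    return rows


all_rows = fetch_all(conn)
```

The same loop shape translates to Power Query (a `List.Generate` over filtered ranges) if the key column is available; the essential point is paging on a key rather than maintaining N hard-coded queries.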
Thank You
Hi @sun-sboyanapall ,
Q1. I have an idea, but I haven't tested it yet, and it may affect performance. The steps are similar to yours: filter the table into blocks, import each block into Power Query, and then append the tables in the Power Query Editor.
Q2. If this dataflow resides in a workspace on Premium capacity, you can configure incremental refresh for the dataflow; it will then load only the latest data into the dataflow according to the refresh policy you define.
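The idea behind incremental refresh in Q2 can be sketched in a few lines: keep a watermark of the newest data already loaded, and on each refresh query only rows past that watermark instead of reloading everything. This is a simplified Python illustration of the concept (real incremental refresh partitions by date and may re-process recent partitions; the names here are invented):

```python
from datetime import date

store = {}        # already-loaded partitions, keyed by day
watermark = None  # newest day loaded so far


def refresh(source_rows):
    """source_rows: iterable of (day, payload) from the source.
    Loads only rows newer than the watermark; returns rows loaded."""
    global watermark
    new = [(d, p) for d, p in source_rows
           if watermark is None or d > watermark]
    for d, p in new:
        store.setdefault(d, []).append(p)
        if watermark is None or d > watermark:
            watermark = d
    return len(new)


history = [(date(2026, 1, 1), "a"), (date(2026, 1, 2), "b")]
first = refresh(history)                              # loads both rows
second = refresh(history)                             # nothing new: loads 0
third = refresh(history + [(date(2026, 1, 3), "c")])  # loads only the new day
```

The second refresh loads zero rows because everything is already behind the watermark, which is exactly the saving over the full-reload approach in the question.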
Please refer here.
Using incremental refresh with dataflows
Best Regards,
Community Support Team_Gao
If any post helps, please consider accepting it as the solution to help other members find it more quickly. If I misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!
How to get your questions answered quickly -- How to provide sample data