I have been following this guide (crmchartguy.com/2017/09/30/use-fetchxml-in-power-bi-with-dynamics-365-customer-engagement/) to import data from my Dynamics dashboard into the Power BI desktop app. However, there is a limitation of importing 5000 rows at a time. Is there an optimal way to work around this and import, say, 20000 rows from Dynamics into one table in Power BI that can be auto-refreshed over time? I would appreciate a point in the right direction.
This page describes the paging cookie:
Page large result sets with FetchXML (Microsoft Dataverse) - Power Apps | Microsoft Docs
Also, if it refreshes in the desktop, you should be able to refresh it in the service. However, you'll likely need to use the relative path syntax for the Web.Contents function.
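For reference, that relative-path pattern looks roughly like this in M. This is only a sketch: the org URL, API version, and entity set ("incidents" for cases) are placeholders, and the fetchXml string stands in for whatever query you built from the guide. Keeping the base URL fixed and putting the rest in RelativePath is what lets the service resolve the data source for scheduled refresh.

let
    // Placeholder org URL, API version, and entity set
    source = Web.Contents(
        "https://yourorg.crm.dynamics.com",
        [
            RelativePath = "/api/data/v9.1/incidents",
            Query = [ fetchXml = "<fetch>...</fetch>" ]   // your fetchXML goes here
        ]
    ),
    json = Json.Document(source)
in
    json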
Pat
To learn more about Power BI, follow me on Twitter or subscribe on YouTube.
20,000 records shouldn't be too bad with any of the connectors (OData, Dynamics, Dataverse). Have you tried those? How long does refresh take?
In any case, it looks like you can use a skiptoken with fetchXML. This article shows that with Power Automate, so it should be possible from Power Query too. You should be able to make a list of number values (as text) to be concatenated into your fetchXML calls (or function).
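As a rough sketch of that idea in M: the helper name fnGetPage, the org URL, and the entity are my own placeholders, and it pages with the fetch element's page/count attributes. Very large result sets may still need the paging cookie from the docs linked above.

let
    // fnGetPage is a hypothetical helper that drops a page number (as text)
    // into the fetch element, e.g. <fetch page="2" count="5000">, and returns
    // that page's rows
    fnGetPage = (pageNumber as text) =>
        let
            fetchQuery =
                "<fetch page=""" & pageNumber & """ count=""5000"">"
                & "<entity name=""incident""><all-attributes /></entity></fetch>",
            response = Json.Document(
                Web.Contents(
                    "https://yourorg.crm.dynamics.com",   // placeholder org URL
                    [ RelativePath = "/api/data/v9.1/incidents",
                      Query = [ fetchXml = fetchQuery ] ])),
            rows = Table.FromRecords(response[value])
        in
            rows,
    // Five pages of 5,000 covers a little over 20,000 rows
    pageNumbers = List.Transform({1..5}, each Text.From(_)),
    combined = Table.Combine(List.Transform(pageNumbers, fnGetPage))
in
    combined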
Pat
To learn more about Power BI, follow me on Twitter or subscribe on YouTube.
Hi @mahoneypat,
Thank you for the suggestion. I forgot to add that I did try that option, and the issue I ran into was that it was a lot of trouble to re-create the views that I can easily see in Dynamics, for example, cases filtered by a certain topic. The guide linked in my original post helped me eliminate that issue; the problem is that I then have to deal with the 5000 row limitation. I have already gone through the trouble of breaking down my 20000 rows into individual 5000 row tables so that I could use that method. The issue there is: how would I merge, refresh, and append to that data without any duplicates or overlap? I am still open to alternative solutions, though.
So you are pulling the 20k rows in via 4 queries? What is the difference between them? You should be able to start with a table of the 4 different values and then pass those into the fetchXML call, to keep it in one table/query.
Pat
To learn more about Power BI, follow me on Twitter or subscribe on YouTube.
@mahoneypat It is actually a little over 20000, and thank you for the suggestion. I made five separate tables to pull in all the data within the 5000 row limit and then merged them into one table. There are no gaps between them, as I counted row for row by setting up date ranges. I was thinking I could set up a daily or weekly view in Dynamics and use that in Power BI Desktop, which could run/refresh every day and automatically append to this larger 20000+ row table, checking for duplicates by one of the columns. Getting everything to work in Power BI Desktop is one thing; however, I am also looking to have this work in the cloud (a workspace) after publishing so I can share it with others. Could you elaborate on what you mean by using a fetchXML call to keep it all in one query?
For your Closed Block 1, 2, 3, etc. queries, what is different about them? Are you using a skiptoken, or some other way of getting the next 5000 records? Whatever it is, you can likely start with a table of those values and then concatenate them into your call.
Can you share two of your queries? Or at least the part that shows the fetchXML calls?
Pat
To learn more about Power BI, follow me on Twitter or subscribe on YouTube.
Hi @mahoneypat ,
The difference between the queries (Closed Block 1, 2, etc.) is the date range of each query. I don't think I am using a "skiptoken" exactly; I just specified the date ranges in each query/block so that each would return slightly under 5000 rows. For example, the conditions in my XML query for Block 2 read:
<condition attribute="receivedon" operator="on-or-after" value="2015-01-30" />
<condition attribute="receivedon" operator="on-or-before" value="2017-04-11" />
---
In addition, I see one column of numbers, named value.@odata.etag, that I could potentially use to identify duplicates. I followed the guide at crmchartguy.com/2017/09/30/use-fetchxml-in-power-bi-with-dynamics-365-customer-engagement/ to pull the data in after manually creating views with under 5k rows in Advanced Find through the Dynamics dashboard.
Let me know if anything doesn't make sense.
Without a Dynamics instance to try stuff out on, I can't provide the specific syntax I'd like to. I see two potential paths.
One is to make a list of numbers in increments of 5000 and pass those into Web.Contents with the OData URL, using $top and $skip parameters similar to the syntax below. With fewer than 25k rows, refresh shouldn't take long.
https://*****.operations.dynamics.com/data/CustomersV3?$skip=10000&$top=10000
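In M, that could look roughly like the sketch below. The base URL and entity path are placeholders in the style of the example above, and it assumes the endpoint accepts $skip/$top the way that data endpoint does.

let
    // Generate offsets in steps of 5,000 and request one page per offset
    baseUrl = "https://yourorg.operations.dynamics.com",   // placeholder URL
    getPage = (skip as number) =>
        let
            response = Json.Document(
                Web.Contents(
                    baseUrl,
                    [ RelativePath = "/data/CustomersV3",
                      Query = [ #"$skip" = Text.From(skip), #"$top" = "5000" ] ])),
            rows = Table.FromRecords(response[value])
        in
            rows,
    offsets = List.Transform({0..4}, each _ * 5000),
    combined = Table.Combine(List.Transform(offsets, getPage))
in
    combined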
This video shows how to do similar with another REST API.
Power BI - Tales From The Front - REST APIs - YouTube
For the fetchXML approach, you can either generate or create a table of your date range values as text and pass those into the web calls, or try something similar with the paging cookie syntax.
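Here is a sketch of the date-range version. The helper name fnGetBlock, the org URL, and the entity set are my own placeholders; the receivedon conditions mirror the ones you posted, and the first date range is just an example value. Each range lands in its own row of the Data column, and Table.Combine stacks them into one query that refreshes as a unit.

let
    // fnGetBlock is a hypothetical helper that splices a from/to date pair
    // into the receivedon conditions and returns that block's rows
    fnGetBlock = (from as text, to as text) =>
        let
            fetchQuery =
                "<fetch><entity name=""incident""><all-attributes /><filter>"
                & "<condition attribute=""receivedon"" operator=""on-or-after"" value=""" & from & """ />"
                & "<condition attribute=""receivedon"" operator=""on-or-before"" value=""" & to & """ />"
                & "</filter></entity></fetch>",
            response = Json.Document(
                Web.Contents(
                    "https://yourorg.crm.dynamics.com",   // placeholder org URL
                    [ RelativePath = "/api/data/v9.1/incidents",
                      Query = [ fetchXml = fetchQuery ] ])),
            rows = Table.FromRecords(response[value])
        in
            rows,
    // Date ranges as text (first row is a placeholder; the second is the
    // Block 2 range quoted above)
    ranges = #table(
        {"From", "To"},
        { {"2013-01-01", "2015-01-29"}, {"2015-01-30", "2017-04-11"} }),
    withData = Table.AddColumn(ranges, "Data", each fnGetBlock([From], [To])),
    combined = Table.Combine(withData[Data])
in
    combined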
Pat
To learn more about Power BI, follow me on Twitter or subscribe on YouTube.
Hi @mahoneypat ,
Thank you for the response, and noted. Can you clarify what you mean by the paging cookie syntax? Also, I am wondering if the process I followed to merge the tables within Power BI Desktop is already a potential solution. Do you think a Power BI workspace (after publishing) would still refresh/automatically update tables from a Web API source even after they are appended? I may need to test this on my end to verify, but I would appreciate your input on that as well.