Hi @JeanY1,
To effectively retrieve paginated data from the Degreed API using Power Query, it's necessary to understand that Degreed uses a cursor-based pagination system, where the response includes a links.next URL pointing to the next page of data.
A common issue in such implementations is failing to manage rate limits, data accumulation, and error handling properly. To comply with Degreed's rate limit of 70 requests per minute, it's best to introduce a delay of at least one second before each API call, throttling requests proactively rather than reacting only after a rate-limit error occurs.
Then, Power Query’s List.Generate function should be carefully structured to retain and combine all responses across pages, as poorly managed iterations can result in partial data loads. It's also critical to implement retry logic, particularly for handling HTTP 429 errors due to rate limits.
This involves automatically reattempting failed requests a few times before aborting, enhancing the reliability of the data pull. By restructuring the logic to fetch one page at a time with a delay, follow the links.next cursor accurately, combine results across iterations, and handle transient errors gracefully, you can build a robust Power Query solution for extracting all data from the Degreed API.
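The approach above can be sketched in Power Query M. This is a minimal illustration only, not verified against the Degreed API: the endpoint URL, the bearer-token header, and the `links.next` / `data` field names are assumptions taken from the description in this thread, so adjust them to match your actual response shape.

```
// Sketch only: BaseUrl, the Authorization header, and the links/data
// field names are assumptions, not confirmed Degreed API details.
let
    BaseUrl = "https://api.degreed.com/api/v2/completions",  // hypothetical endpoint

    // Fetch one page, pausing 1 second before the call (proactive throttle,
    // well under 70 requests/minute) and retrying up to 3 times on failure
    // such as HTTP 429.
    GetPage = (url as text) =>
        let
            Attempt = (n as number) =>
                let
                    Result = Function.InvokeAfter(
                        () => Json.Document(
                            Web.Contents(url, [Headers = [Authorization = "Bearer <token>"]])),
                        #duration(0, 0, 0, 1))
                in
                    try Result
                    otherwise if n < 3 then @Attempt(n + 1)
                              else error "Request failed after 3 attempts"
        in
            Attempt(0),

    // Walk the cursor: keep following links.next until it is null or missing.
    Pages = List.Generate(
        () => GetPage(BaseUrl),
        each _ <> null,
        each
            let next = try [links][next] otherwise null
            in  if next = null then null else GetPage(next),
        each [data]),

    // Flatten the per-page data lists into one list of records.
    Combined = List.Combine(Pages)
in
    Combined
```

Because `Function.InvokeAfter` delays before every call, the throttle applies even on retries, and `List.Generate` only requests the next page after the current one has been materialized, so partial loads from eager evaluation are avoided.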
Thanks for reaching out! If this answer was helpful, please consider marking it as the Accepted Solution and giving Kudos; it helps the community!
Thank you.
Hi @JeanY1,
Since we haven't heard back from you yet, I'd like to confirm whether you've resolved this issue or still need further help.
If you've already resolved the issue, you can mark the helpful reply as a "solution" so others know that the question has been answered and help other people in the community. Thank you again for your cooperation!
If you still have any questions or need more support, please feel free to let us know. We are more than happy to continue to help you.
Hi @JeanY1,
we haven't heard back from you regarding our last response and wanted to check if your issue has been resolved.
If the community member's response addressed your query, please mark it as the Accepted Answer and give Kudos. Should you have any further questions, feel free to reach out.
Thank you for being a part of the Microsoft Fabric Community Forum!
Hi @JeanY1,
May I ask if you have gotten this issue resolved?
If it is solved, please mark the helpful reply (or share your own solution and accept it as the solution); this will help other community members with similar problems find the answer faster.
Thank you.
Hi @JeanY1,
Thank you for raising this concern. I also appreciate @kushanNa for sharing a helpful reference link.
To assist you better, could you please confirm the specific API you are working with? Since APIs can vary in how they handle pagination, knowing the exact one will help us provide more accurate guidance.
Based on your description and the structure of the response (with links and data), it appears your API uses a typical RESTful pagination model. Also, kindly note that the rate-limiting logic in your current setup may not be effectively controlling the request frequency, which could be causing the stall with large datasets. Once we confirm the API and review your implementation approach, we can help refine the logic to ensure stable performance across all table sizes.
Looking forward to your response.
If this solution worked for you, kindly mark it as the Accepted Solution and feel free to give Kudos; it would be much appreciated!
Thank you.
I set the rate limit in the code to less than the required rate limit to make sure it wasn't an issue. What keeps happening is that multiple calls are successful but then somewhere in the middle of the calls it becomes unsuccessful.
Hi @JeanY1,
What is your API? Check if this helps: https://community.fabric.microsoft.com/t5/Desktop/Rest-API-Json-several-pages-automatically-call-the...