JeanY1
Regular Visitor

Retrieving paginated JSON from a rate-limited API

Please help... I am trying to retrieve paginated data from an API. Each call returns an object containing two objects: "links", which holds a "self" link and a "next" link, and the data itself. I need to keep calling the "next" link and appending each page's data to what was already retrieved, and I need to do this for every table in the database. Some of the tables have hundreds of thousands of rows and some have only a few hundred, but the large tables stall out. The rate limit is 70 requests per minute. Thanks in advance.
let
    auth_token = "ABCDEFG",
    hed = [Authorization = "Bearer " & auth_token],
    url = "https://www.test.com",

    GetPage = (url as text) as record =>
        let
            response = Json.Document(Web.Contents(url, [Headers = hed])),
            _ = Function.InvokeAfter(() => response, #duration(0, 0, 0, 1/60)), // Rate limit of 60 requests per second
            headers = Web.Contents(url, [Headers = hed, ManualStatusHandling={200}])[Headers]
        in
            [response = response, headers = headers],

    initialResponse = GetPage(url),
    total_response = initialResponse[response][data],

    GetAllPages = List.Generate(
        () => [response = initialResponse[response], nextLink = try if Record.HasFields(initialResponse[response], "links") then initialResponse[response][links][next] else null otherwise null],
        each [nextLink] <> null,
        each [
            pageData = GetPage([nextLink]),
            response = pageData[response],
            nextLink = try if Record.HasFields(response, "links") then response[links][next] else null otherwise null,
            total_response = List.Combine({total_response, response[data]}),
            headers = pageData[headers]
        ],
        each [response]
    ),

    finalData = List.Last(GetAllPages)[total_response]
in
    finalData

1 ACCEPTED SOLUTION
v-sgandrathi
Community Support

Hi @JeanY1,

 

To retrieve paginated data from the Degreed API in Power Query, keep in mind that Degreed uses cursor-based pagination: each response includes a links.next URL that points to the next page of data.

A common issue in implementations like this is failing to manage rate limits, data accumulation, and error handling properly. To comply with Degreed's limit of 70 requests per minute, introduce a delay of at least one second before each API call, so that requests are throttled proactively rather than after the call has already been made.

Power Query's List.Generate function should then be structured so that it retains and combines the responses from all pages; a poorly managed iteration can silently produce a partial data load. It is also critical to implement retry logic, particularly for HTTP 429 errors caused by the rate limit: automatically reattempt a failed request a few times before aborting, which makes the data pull far more reliable.

In short, by restructuring the logic to fetch one page at a time with a delay, following the links.next cursor accurately, combining results across iterations, and handling transient errors gracefully, you can build a robust Power Query solution for extracting all data from the Degreed API. A minimal sketch of this pattern is shown below.
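For illustration, here is a minimal sketch of that pattern in M. It reuses the hed header record and the placeholder url from the original post, assumes each parsed page is a record of the form [links = [next = ...], data = {...}], and uses illustrative values for the delay and retry count rather than anything Degreed-specific:

let
    auth_token = "ABCDEFG",
    hed = [Authorization = "Bearer " & auth_token],
    url = "https://www.test.com",

    // Delay ~1 second before every request to stay under 70 requests per minute,
    // and retry a few times when the service answers 429 (Too Many Requests).
    GetPageWithRetry = (pageUrl as text, optional attempt as nullable number) as record =>
        let
            tryCount = if attempt = null then 1 else attempt,
            raw = Function.InvokeAfter(
                      () => Web.Contents(pageUrl, [Headers = hed, ManualStatusHandling = {429}]),
                      #duration(0, 0, 0, 1)),
            status = Value.Metadata(raw)[Response.Status],
            result =
                if status = 429 and tryCount < 5 then
                    // Back off a little longer, then retry the same page.
                    Function.InvokeAfter(() => @GetPageWithRetry(pageUrl, tryCount + 1), #duration(0, 0, 0, 5))
                else if status = 429 then
                    error "Rate limit retries exhausted for " & pageUrl
                else
                    Json.Document(raw)
        in
            result,

    // Walk the links.next cursor, carrying the combined data forward in the state record.
    AllPages = List.Generate(
        () => let first = GetPageWithRetry(url) in [page = first, accumulated = first[data]],
        (state) => state[page] <> null,
        (state) =>
            let
                nextUrl  = try state[page][links][next] otherwise null,
                nextPage = if nextUrl <> null and nextUrl <> "" then GetPageWithRetry(nextUrl) else null
            in
                [
                    page        = nextPage,
                    accumulated = if nextPage <> null
                                  then state[accumulated] & nextPage[data]
                                  else state[accumulated]
                ],
        (state) => state[accumulated]
    ),

    finalData = List.Last(AllPages)
in
    finalData

Because the whole combined list is carried in the List.Generate state, the last generated item already contains every page, so no separate accumulation step is needed afterwards.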

 

Thanks for reaching out! If this answer was helpful, please consider marking it as the Accepted Solution and giving it a Kudos; it helps the community!
Thank you.


9 REPLIES
v-sgandrathi
Community Support


Hi @JeanY1,

 

Since we haven't heard back from you yet, I'd like to confirm whether you've resolved this issue or still need further help.
If you've already resolved it, please mark the helpful reply as the solution so that others in the community know the question has been answered. Thank you again for your cooperation!
If you still have any questions or need more support, please feel free to let us know. We are more than happy to continue helping you.

Hi @JeanY1,


We haven't heard back from you regarding our last response and wanted to check whether your issue has been resolved.

If the reply addressed your query, please mark it as the Accepted Solution and give it a Kudos. Should you have any further questions, feel free to reach out.
Thank you for being a part of the Microsoft Fabric Community Forum!

Hi @JeanY1,

 

May I ask whether you have resolved this issue?

If it is solved, please mark the helpful reply (or share your own solution) and accept it as the solution; this will help other community members with similar problems find the answer faster.

 

Thank you.

v-sgandrathi
Community Support

Hi @JeanY1,

 

Thank you for raising this concern, and thank you to @kushanNa for sharing a helpful reference link.

To assist you better, could you please confirm the specific API you are working with? Since APIs can vary in how they handle pagination, knowing the exact one will help us provide more accurate guidance.

 

Based on your description and the structure of the response (with links and data), it appears your API uses a typical RESTful pagination model. Also, kindly note that the rate-limiting logic in your current setup may not be effectively controlling the request frequency, which could be causing the stall with large datasets. Once we confirm the API and review your implementation approach, we can help refine the logic to ensure stable performance across all table sizes.
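One possible reason the delay in the current query has no effect is that the result of Function.InvokeAfter is never used, so the expression is never evaluated at all; wrapping the Web.Contents call itself in the delay makes it take effect. A sketch of that change (reusing the hed record from the original query; the one-second duration is an assumption based on a 70-requests-per-minute limit):

// Hedged sketch: the delay only takes effect if the Web.Contents call itself
// runs inside Function.InvokeAfter; delaying an already-computed response does nothing.
GetPage = (pageUrl as text) as record =>
    Json.Document(
        Function.InvokeAfter(
            () => Web.Contents(pageUrl, [Headers = hed]),
            #duration(0, 0, 0, 1)   // about one second per call
        )
    )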

Looking forward to your response.

 

If this solution worked for you, kindly mark it as Accept as Solution and feel free to give a Kudos; it would be much appreciated!

 

Thank you.

It's Degreed. Here is the developer docs page on Pagination

I set the delay in the code so that requests stay below the required rate limit, to make sure that wasn't the issue. What keeps happening is that multiple calls succeed, but then somewhere in the middle of the run a call becomes unsuccessful.
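As one way to see which call is failing mid-run, a hedged sketch like the one below (reusing the hed record from the original query; the list of status codes is an assumption) returns the HTTP status alongside the body instead of letting the query error out:

// Hedged sketch: surface the HTTP status of each call so a mid-run failure
// shows which status (for example 429) stopped the load.
ProbePage = (pageUrl as text) as record =>
    let
        raw    = Web.Contents(pageUrl, [Headers = hed, ManualStatusHandling = {429, 500, 502, 503}]),
        status = Value.Metadata(raw)[Response.Status],
        body   = if status = 200 then Json.Document(raw) else null
    in
        [status = status, body = body]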

kushanNa
Super User

It's Degreed. Here is the developer docs page on Pagination
