trevorgermain
Advocate I

Too many requests in Web.Contents. Can I share a request?

Hi,

 

I am querying a web API using Web.Contents in a loop to get paged data. The API has a throttled request limit of 300 requests per 10 minutes. A sample response looks like this:

{
    "results": {
        "timesheets": {
            "123456": {
                "id": "123456",
                "job_code": "0001",
                "user": "9999"
            }
        }
    },
    "more": "true",
    "supplemental_data": {
        "job_codes": {
            "0001": {
                "name": "Job 1"
            }
        },
        "users": {
            "9999": {
                "name": "Trevor"
            }
        }
    }
}

I use List.Generate combined with Web.Contents to get all of the data for a specific date range, using the "more" value to determine whether more pages are available.
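For reference, here is a minimal sketch of that pattern in M. The endpoint URL and the query parameter names (page, start_date, end_date) are placeholders, not the real API's. Function.InvokeAfter paces the calls at one every two seconds, which stays under the 300-requests-per-10-minutes limit:

let
    // Hypothetical endpoint and parameters -- substitute your real API's details
    GetPage = (page as number) =>
        Json.Document(
            Web.Contents(
                "https://api.example.com/v1/timesheets",
                [Query = [page = Number.ToText(page), start_date = "2018-01-01", end_date = "2018-01-31"]]
            )
        ),
    // Pace the requests: 300 per 10 minutes works out to one every 2 seconds
    GetPagePaced = (page as number) =>
        Function.InvokeAfter(() => GetPage(page), #duration(0, 0, 0, 2)),
    // Keep fetching while the previous page reported more pages; the sample shows
    // "more" as the string "true" -- compare against true if your API returns a boolean
    Pages = List.Generate(
        () => [Page = 1, Data = GetPagePaced(1)],
        each [Data] <> null,
        each [
            Page = [Page] + 1,
            Data = if [Data][more] = "true" then GetPagePaced([Page] + 1) else null
        ],
        each [Data]
    )
in
    Pages

Each item in Pages is then one response record shaped like the sample above, ready to be flattened into the fact table.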

 

For modeling, the "timesheets" records become my facts, and the supplemental data becomes my dimensions. The main query creates my fact table; then I reference that query and transform it to extract my users and job_codes dimensions.
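As an illustration, a dimension query along these lines would do that extraction. ApiResponse is a hypothetical name for a query that returns one combined response record shaped like the sample above:

let
    Source = ApiResponse,  // hypothetical query holding a combined response record
    Users = Source[supplemental_data][users],
    // Record.ToTable yields two columns: Name (the user id key) and Value (the nested record)
    AsTable = Record.ToTable(Users),
    Expanded = Table.ExpandRecordColumn(AsTable, "Value", {"name"}, {"user_name"}),
    Renamed = Table.RenameColumns(Expanded, {{"Name", "user_id"}})
in
    Renamed

The job_codes dimension follows the same shape. Note that each query referencing the main one still re-evaluates it, which is exactly the problem described next.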

 

By referencing the main query, I wind up making the paged API calls 3x (once for each table). Ideally, I'd like the API to be called only once, populating my three tables without having to go back to the source. Is there any way to "cache" the API requests and reuse them as the source for multiple queries?

 

1 ACCEPTED SOLUTION

I ended up writing a web API as an intermediary that caches responses in a Redis cache. It seems to work really well for this scenario.


5 REPLIES
v-ljerr-msft
Employee

Hi @trevorgermain,

By referencing the main query, I wind up making the paged API calls 3x (once for each table). Ideally, I'd like the API to be called only once, populating my three tables without having to go back to the source. Is there any way to "cache" the API requests and reuse them as the source for multiple queries?

As far as I know, there isn't a way to do it currently.

 

Here is a good article about Referenced Queries And Caching In Power BI And Power Query, which could help you better understand this issue. :)

 

Regards

trevorgermain

I ended up writing a web API as an intermediary that caches responses in a Redis cache. It seems to work really well for this scenario.

Anonymous
Not applicable

Hello @trevorgermain,

 

I am struggling with the same issue, but my API endpoint returns a 500 response for some reason, and that kills my refresh. Could you share the code as well? Does this solution work with scheduled refresh in the Service?

 

Br, 

Szabolcs

trevorgermain

As it turns out, this didn't work in the Service at all. I don't think the Service supports the Web API approach very well. It works in Desktop, though. I've actually pulled this out of Power BI and moved it to Excel with Power Query / Pivot Tables to get the desired result. Frankly, I didn't want the customer to have to use Power BI Desktop. It's a SharePoint-hosted solution that would have been really slick with the Power BI web part, but it won't work with the Web API.

 

As for your 500 error, try running Fiddler while making the request; you might be able to see more information about what the actual error is. If it seems to be a legitimate error on the API, you may have to contact your API provider for support.
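If the goal is just to keep a refresh alive through an intermittent 500, one option worth trying is the ManualStatusHandling option of Web.Contents, which suppresses the automatic error for the listed status codes so the query can decide what to do instead. A minimal sketch, with a placeholder URL:

let
    // ManualStatusHandling stops Web.Contents from raising an error for the listed codes
    Response = Web.Contents(
        "https://api.example.com/v1/timesheets",  // hypothetical URL
        [ManualStatusHandling = {500}]
    ),
    // The HTTP status code travels in the response metadata
    Status = Value.Metadata(Response)[Response.Status],
    Result = if Status = 500 then null else Json.Document(Response)
in
    Result

Whether this behaves the same under scheduled refresh in the Service is worth testing separately, since Web.Contents has known limitations there.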

v-ljerr-msft

Hi @trevorgermain,

Great to hear the problem got resolved! Could you accept your reply above as the solution to close this thread (which could also help others who have a similar issue easily find the answer)? :)

 

Regards
