dan_blyth_1
Regular Visitor

Looking for a Limited API solution - INSIGHTLY API

Hi all,

Thank you for taking the time to review this post.

 

To give some background on the issue I am facing: I am working with the Insightly API. There are three versions of the API, each with subtle differences, such as fields that are available in v2.2/v2.3 but do not exist in v2.1. Critically, v2.2 and v2.3 restrict each call to a maximum of 500 rows. I have managed to work around this and can pull all records from the required views using a method I found and modified, built around the following function:

 

This is the call made to the API, within which I am using a recursive method to pull 500 rows, then pull another 500 skipping the previous 500, and so on. This is explained as clearly as possible in the code and the four screenshots below:

 

[Screenshot: M Code of Tasks Connection - Skip & Offset.png]

Notice the expression "skip=" & OffSet & "&top=500" within the URL above, in conjunction with the table below. The call pulls 500 records from each offset point.
[Screenshot: Source Table - Offset Records.png]

Notice that the intervals above are spaced at the maximum number of rows the API will allow per call.
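For readers without the screenshots, here is a minimal sketch of what such a per-offset call can look like in M. This is an assumption-laden reconstruction, not the exact query from the screenshot: the base URL, the Tasks endpoint, and the authorization header are all hypothetical placeholders.

// Sketch only - endpoint and auth header are hypothetical placeholders
(OffSet as text) =>
let
    Source = Json.Document(
        Web.Contents("https://api.insightly.com/v2.2/Tasks?skip=" & OffSet & "&top=500",
            [Headers = [Authorization = "Basic <base64-encoded-api-key>"]])),
    // each call returns at most 500 records as a JSON array
    AsTable = Table.FromList(Source, Splitter.SplitByNothing(), {"Record"})
in
    AsTable

Invoking a function like this against each row of the OffSet table (for example with Table.AddColumn) produces the nested tables shown in the third screenshot.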

 

[Screenshot: Records Populating with Offset.png]

Sets of 500 records are retrieved, and upon expanding the columns, the desired result is returned, as shown in the screenshot below. To tidy this up I could delete the OffSet column once expanded, but that is not really important at this stage.

[Screenshot: Expanded Tasks.png]

 

 

The issue is that this is not future-proof. Because I know how many records exist in the table (based on v2.1 of the API, which is unlimited), I am able to set my intervals out to 11001 in buckets of 500, and all records are retrieved; note the formula in the second screenshot above:

 

"= Table.TransformColumnTypes(Table.FromRows({{"0"},{"501"},{"1001"},{"1501"},{"2001"},{"2501"},{"3001"},{"3501"},{"4001"},{"4501"},{"5001"},{"5501"},{"6001"},{"6501"},{"7001"},{"7501"},{"8001"},{"8501"},{"9001"},{"9501"},{"10001"},{"10501"},{"11001"}},{"OffSet"}) ,{{"OffSet", type text}})"

 

When the records surpass 11501, I will need to manually add a new upper bracket to the list in order to continue retrieving all rows. The solution I am therefore looking to implement concerns the function pasted above: is there a method to have this table populate in buckets of 500 (or another increment) up to an upper limit that can somehow be calculated from a count of records in another table? For instance, if I have a separate table performing a count of all records via API v2.1, and the result is 11750 (for example), could the function above be amended to take that maximum row count and create the table accordingly, i.e. without manually adding a new upper bucket?
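For what it is worth, here is a hedged sketch of how the offset table could be derived from a row count rather than a hard-coded list. TasksV21 is a hypothetical name for a query that returns all records via API v2.1, and the sketch uses clean multiples of 500 (0, 500, 1000, ...) rather than the 0/501/1001 values in the screenshot:

// Sketch only - TasksV21 is a placeholder for the v2.1 query holding all records
let
    MaxRows = Table.RowCount(TasksV21),
    // number of 500-row buckets needed to cover every record
    BucketCount = Number.RoundUp(MaxRows / 500),
    // offsets 0, 500, 1000, ... - one per bucket
    Offsets = List.Transform(List.Numbers(0, BucketCount, 500), each Text.From(_)),
    OffsetTable = Table.FromList(Offsets, Splitter.SplitByNothing(), {"OffSet"})
in
    OffsetTable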

 

Just for clarity, the solution I have works perfectly for the time being, using the manually assigned intervals: I am able to manually update it, retrieve all records, and so on. The issue will come when I have to hand this over to a less experienced end user to maintain, so automating this piece would be a big win. I am also aware of others who have experienced similar issues, both with this particular API and others like it.

 

Any help would be greatly appreciated, 

 

Thanks.

5 REPLIES
Anonymous
Not applicable

@ImkeF, any ideas about this issue?


Regards,
Lydia

ImkeF
Community Champion

You can use List.Generate to generate a nested list of lists like this:

 

(MaxValue as number, StartValue as number, Increment as number) =>

List.Generate(  ()=>[x = StartValue, Counter = 0],                      // seed: first offset and a counter
                each [Counter] <= Number.RoundUp(MaxValue/Increment),   // run until MaxValue is covered
                each [x = [x]+Increment, Counter = [Counter]+1],        // step the offset by one increment
                each {[x]})                                             // return each offset as a single-item list
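A possible usage, assuming the function above is saved as a query named fnOffsets (a hypothetical name); the nested lists feed straight into Table.FromRows:

// Hypothetical usage of the function above, saved as fnOffsets
let
    Offsets = fnOffsets(11750, 0, 500),   // yields {{0}, {500}, ..., {12000}}
    OffsetTable = Table.FromRows(Offsets, {"OffSet"})
in
    OffsetTable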

Imke Feldmann (The BIccountant)

If you liked my solution, please give it a thumbs up. And if I did answer your question, please mark this post as a solution. Thanks!

How to integrate M-code into your solution -- How to get your questions answered quickly -- How to provide sample data -- Check out more PBI learning resources here -- Performance Tips for M-queries

Hi @ImkeF @Anonymous, thanks for your help and responses to this post; however, as far as I can tell, the solution offered does not really fit the issue. To be clear, I am OK with manually setting the start point (0), and the increment will always be 500 (as this is the maximum number of rows retrieved per call from the API). What will be variable is the end point, hence I need this to be calculated, based either on a value which can be retrieved from another table or on a calculation within the function. This is why I suggested a rounded maximum row count could be implemented, rounded up to the nearest 500: i.e. if we perform a count of rows in another table and the result is 11,349, this would round up to 11,500. The solution provided is valid but merely replicates the one I already have, and would still require manually setting the upper limit.

 

Please let me know if you need any further explanation. 

 

Thanks.

Yes, I didn't fully read your description, as it contains too much irrelevant information, which costs too much time to read. (Normally I would not pick something like this up.)

 

The function to retrieve the desired value from a Table is this:

 

(Table as table, Increment as number) =>

// rounds the row count up to the nearest multiple of Increment,
// e.g. a count of 11,349 with Increment = 500 returns 11,500
Number.RoundUp(Table.RowCount(Table)/Increment)*Increment
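Putting the two replies together (all names hypothetical: TasksV21 for the v2.1 record-count query, fnOffsets for the List.Generate function above, fnUpperLimit for this one), the offset table can then be built without ever adding an upper bucket by hand:

// Hypothetical names throughout - adjust to your own queries
let
    UpperLimit  = fnUpperLimit(TasksV21, 500),   // e.g. a count of 11,349 returns 11,500
    OffsetTable = Table.FromRows(fnOffsets(UpperLimit, 0, 500), {"OffSet"})
in
    OffsetTable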

Imke Feldmann (The BIccountant)


@ImkeF Thanks I will give it a go.

 

I will not defend my initial post, only to say that far too many threads on here are vague, resulting in solutions being suggested which are either not relevant, have been misinterpreted, or simply do not solve the issue being faced. My post may have been "detailed" (IMO), but at least it gave the background and the reasons why I need(ed) such a solution.

 

Many thanks for your help. 

 
