I have tried for hours to get this recursive formula to work, but at best it only pulls two of the 15+ pages.
Can anyone see what I am doing wrong?
GetPages = (url as text, accumulatedData as list) as list =>
    let
        Source = Json.Document(Web.Contents(url, [Headers=[Authorization="Bearer " & token, Accept="application/json"]])),
        data = try Source[data] otherwise null,
        nextToken = try Source[nextToken] otherwise null,
        newData = if data <> null then accumulatedData & {data} else accumulatedData,
        result1 = if nextToken <> null then @GetPages(url & "&nextToken=" & nextToken, newData) else List.Combine(newData)
    in
        result1,
Can anyone see what I am doing wrong?
You are using a recursive function 🙂 . They are cute but mostly useless especially when you have lots of data that you have to lug around in the bowels of the recursion. It is much better to use List.Generate if possible and List.Accumulate if necessary, with a healthy helping from your browser cache. Or, if you want to be really clever and the scenario supports it (which yours unfortunately doesn't) you can spread out the API call into multiple queries for maximum parallelism and speed.
Here's an article that may be of help: Handling paging for Power Query connectors - Power Query | Microsoft Learn
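For reference, here's a minimal, untested sketch of what a List.Generate pager can look like. It assumes the same token variable and the data/nextToken response fields shown in your code, plus a baseUrl that already carries a query string; none of that is confirmed beyond what you posted.
GetAllPages = (baseUrl as text) as list =>
    let
        GetPage = (pageUrl as text) as record =>
            let
                response = Json.Document(Web.Contents(pageUrl, [Headers = [Authorization = "Bearer " & token, Accept = "application/json"]])),
                // Record.FieldOrDefault returns null instead of erroring when a field is missing
                pageData = Record.FieldOrDefault(response, "data", null),
                next = Record.FieldOrDefault(response, "nextToken", null)
            in
                [Data = pageData, NextToken = next],
        pages = List.Generate(
            () => GetPage(baseUrl),
            each [Data] <> null, // keep every page that returned rows, including the last one
            each if [NextToken] <> null
                then GetPage(baseUrl & "&nextToken=" & Uri.EscapeDataString([NextToken]))
                else [Data = null, NextToken = null], // sentinel state stops the loop after the last page
            each [Data]
        ),
        combined = List.Combine(pages)
    in
        combined
The point of the sentinel is that the loop ends on "no data returned" rather than "no next token", so the final page is still emitted.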
Actually, good news: I've got it pulling the data now. However, it isn't loading the last page. I think once the token goes null it takes that as a cue to move on and doesn't load the last data pull. Any idea on how to ensure it adds the final page?
let
    // API Credentials
    oauthSite = "https://<Redacted>.docebosaas.com/oauth2/token",
    clientId = "<Redacted>",
    clientSecret = "<Redacted>",
    // User Credentials
    email = "<Redacted>",
    pwd = "<Redacted>",
    // Assemble Token URL
    tokenURL = "client_id=" & clientId & "&" & "client_secret=" & clientSecret & "&" & "grant_type=password&scope=api&username=" & email & "&" & "password=" & pwd,
    binaryTokenURL = Text.ToBinary(tokenURL),
    // Call for Token
    tokenData = Json.Document(Web.Contents(oauthSite, [Headers = [#"Content-Type"="application/x-www-form-urlencoded"], Content=binaryTokenURL])),
    // Extract the token from the returned record
    token = tokenData[access_token],
    site = "https://<Redacted>.docebosaas.com",
    ver = "/analytics/v1/",
    page = "reports/",
    idReport = "<Redacted>",
    csv = "export/csv",
    limit = 1000, // Adjust the limit based on your requirement
    endPoint = site & ver & page & idReport & csv,
    // Retrieve Execution ID using Token
    Source = Json.Document(Web.Contents(endPoint, [Headers=[Authorization="Bearer " & token, Accept="application/json"]])),
    data0 = Source[data],
    executionId = data0[executionId],
    endpoint2 = site & ver & page & idReport & "exports/" & executionId & "/results" & "?pageSize=" & Text.From(limit),
    // Retrieve Paginated Data
    GetPages = (url as text) as list =>
        let
            GetPageData = (url as text) =>
                let
                    SourceGet = Json.Document(Web.Contents(url, [Headers=[Authorization="Bearer " & token, Accept="application/json"]])),
                    dataGet = try SourceGet[data] otherwise null,
                    nextTokenGet = try SourceGet[nextToken] otherwise null,
                    Encoded = try Uri.EscapeDataString(nextTokenGet) otherwise null
                in
                    [Data = dataGet, NextToken = Encoded],
            initialData = GetPageData(url),
            accumulatedData = List.Generate(
                () => initialData,
                each [NextToken] <> null,
                each GetPageData(url & "&nextToken=" & [NextToken]),
                each [Data]
            ),
            result = List.Combine(accumulatedData)
        in
            result,
    GetPagesDelay = (url as text) => Function.InvokeAfter(() => GetPages(url), #duration(0, 0, 0, 15)),
    DataSet = GetPagesDelay(endpoint2)
in
    DataSet
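One possible way to keep the final page (an untested sketch reusing the GetPageData helper above, not something confirmed in this thread): have List.Generate stop when a page returns no data instead of when the token is null, and feed it a sentinel state once the token runs out.
accumulatedData = List.Generate(
    () => GetPageData(url),
    each [Data] <> null, // emit every page that returned rows, including the one whose nextToken is null
    each if [NextToken] = null
        then [Data = null, NextToken = null] // sentinel: ends the loop only after the last page has been emitted
        else GetPageData(url & "&nextToken=" & [NextToken]),
    each [Data]
),
With this shape the condition is evaluated against the data, so the page that carries a null token is still selected before the loop terminates.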
Thanks, took your approach below, but am still getting an error saying that it can't find the nextToken on the record, even though I've confirmed it is there (at least for the first few pages). Any thoughts?
GetPages = (url as text) as list =>
    let
        GetPageData = (url as text) =>
            let
                Source = Json.Document(Web.Contents(url, [Headers=[Authorization="Bearer " & token, Accept="application/json"]])),
                data = try Source[data] otherwise null,
                nextToken = try Source[nextToken] otherwise null
            in
                [Data = data, NextToken = nextToken],
        initialData = GetPageData(url),
        accumulatedData = List.Generate(
            () => initialData,
            each [NextToken] <> null,
            each GetPageData(url & "&nextToken=" & Text.From([NextToken])),
            each [Data]
        ),
        result = List.Combine(accumulatedData)
    in
        result,
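Hard to say from this alone, but one difference from the other snippet in this thread is that the token is appended here without URL-encoding; if it contains characters such as "+" or "=", the raw value can break the request. A small, untested tweak to the next-state function:
each GetPageData(url & "&nextToken=" & Uri.EscapeDataString(Text.From([NextToken]))),
This is only a guess based on the two versions of the code posted here, not a confirmed diagnosis.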
There's no need for a recursive function in your case, since the number of loop iterations is fixed (= the count of tokens).
Try List.Accumulate().
Expertise = List.Accumulate( {Days as from Today}, {Skills and Knowledge}, (Current, Everyday) => Current & Day.LeanAndPractise(Everyday) ) |
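For completeness, List.Accumulate only fits cleanly when the number of pages is known up front. A rough, untested sketch, assuming a hypothetical page-index parameter and a pageCount value (neither is confirmed for this API, which appears to be token-based):
pageCount = 15, // hypothetical: would come from a total-count field in the response
allRows = List.Accumulate(
    {0 .. pageCount - 1}, // one accumulation step per page
    {}, // seed: empty list of rows
    (state, pageIndex) =>
        state & Json.Document(
            Web.Contents(endpoint2 & "&page=" & Text.From(pageIndex), // "&page=" is an assumed parameter, not confirmed
                [Headers = [Authorization = "Bearer " & token, Accept = "application/json"]]))[data]
),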
Provide more details on the URL and a sample snapshot of the JSON it usually returns.