Laila92
Helper V
Powerquery looping over paginated API - dynamic data source error

I am trying to retrieve data from a paginated API, specifically the Zendesk one. The code refreshes on my desktop, but when I try to refresh online I get the error: 'This dataset includes a dynamic data source. Since dynamic data sources aren't refreshed in the Power BI service, this dataset won't be refreshed.' I am working with the relative path, and the base URL I use works by itself.

Any tips?
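For reference, the service's refresh engine statically analyzes every Web.Contents call and refuses to refresh when the URL passed as the first argument is assembled at runtime; the usual workaround is to keep that argument a constant string and push the variable parts into the RelativePath and Query options. A minimal sketch (the subdomain, endpoint, and offset value are placeholders):

```
// Flagged as dynamic: the URL in the first argument changes between calls
bad = Web.Contents("https://xxx.zendesk.com/api/v2/incremental/tickets.json?start_time=" & Text.From(offset)),

// Refreshable: the first argument is constant; the moving parts live in the options record
good = Web.Contents(
    "https://xxx.zendesk.com",
    [
        RelativePath = "/api/v2/incremental/tickets.json",
        Query = [start_time = Text.From(offset)]
    ]
)
```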

 

I tried two ways in M:

 

1. 

let
    BaseUrl = "https://XXX.zendesk.com",
    Token = "XXX",
    //Headerss = Headers=[ #"Authorization" = "Basic " & Token ],

    StartingOptions = [RelativePath="/api/v2/incremental/tickets.json?include=comment_count&start_time=1633089331",Headers=[ #"Authorization" = "Basic " & Token ]],
    //InitRelativePathdynamic="/api/v2/incremental/tickets.json?include=comment_count&start_time=1641042895",
    //RelativePathdynamic ="/api/v2/incremental/tickets.json?include=comment_count&start_time=" & Text.From(newOffset),
    initReq = Json.Document(Web.Contents(BaseUrl, StartingOptions)),
    initData = initReq[tickets],
    //We want to get data = {lastNPagesData, thisPageData}, where each list has the limit # of Records,
    //then we can List.Combine() the two lists on each iteration to aggregate all the records. We can then
    //create a table from those records
    gather = (tickets as list, BaseUrl,Options) =>
        let
            //get new offset from active uri
            newOffset = Json.Document(Web.Contents(BaseUrl,Options))[end_time],
            //build new uri using the original uri so we don't append offsets
            //newUri = BaseUrl & "/api/v2/incremental/tickets.json?include=comment_count&start_time=" & Text.From(newOffset),
            //get new req & data
            newOptions = [RelativePath="/api/v2/incremental/tickets.json?include=comment_count&start_time=" & Text.From(newOffset),Headers=[ #"Authorization" = "Basic " & Token ]],
            newReq = Json.Document(Web.Contents(BaseUrl,newOptions)),
            newdata = newReq[tickets],
            //add that data to rolling aggregate
            data = List.Combine({tickets, newdata}),
            datacount = List.Count(data),
            //if there's no next page of data, return. if there is, call @gather again to get more data
            check = if Text.From(newReq[end_of_stream]) = "true" or datacount >=300000 then data else @gather(data, BaseUrl,newOptions)
        in check,
    //before we call gather(), we want to see if it's even necessary. First request returns only one page? Return.
    outputList = if Text.From(initReq[end_of_stream]) = "true" then initData else gather(initData, BaseUrl, StartingOptions),
    //then place records into a table. This will expand all columns available in the record.
    expand = Table.FromRecords(outputList)
in
    expand

2. 

 

let
    BaseUrl = "https://XXX.zendesk.com/api/v2/incremental/tickets.json",
    Token = "XXX",
    StartingOptions = [RelativePath="?include=comment_count&start_time=1633089331",Headers=[ #"Authorization" = "Basic " & Token ]],
    iterations = 10, 
   // Url = 
    initReqwc = Web.Contents(BaseUrl, StartingOptions),
    initReq = Json.Document(initReqwc),
    initData = initReq[tickets],

    FunctionGetOnePage = 
        (BaseUrl,Options) as record =>
        let 
            Source = Json.Document(Web.Contents(BaseUrl,Options)),
            data = try Source[tickets] otherwise null,
            next = try Source[end_time] otherwise null,
            res = [tickets = data,end_time = next]
        in 
            res  ,

    GeneratedList =
        List.Generate(
            () => [i = 0, res = FunctionGetOnePage(BaseUrl, StartingOptions)],
            each [i] < iterations and [res][tickets] <> null,
            //pass a full options record again; the previous page's end_time becomes the next start_time
            each [i = [i] + 1, res = FunctionGetOnePage(BaseUrl, [RelativePath = "?include=comment_count&start_time=" & Text.From([res][end_time]), Headers = [ #"Authorization" = "Basic " & Token ]])],
            each [res][tickets]),
    //combine all pages into one list of ticket records, then tabulate
    AllTickets = List.Combine(GeneratedList),
    #"Converted to Table" = Table.FromList(AllTickets, Splitter.SplitByNothing(), null, null, ExtraValues.Error)
in
    #"Converted to Table"

 

2 ACCEPTED SOLUTIONS
Laila92
Helper V

Found a solution by putting the start_time parameter in the Query part of the Web.Contents function.
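Concretely, the first argument of Web.Contents stays a constant string, while the moving start_time value is supplied through the Query field of the options record (subdomain and token are placeholders):

```
Web.Contents(
    "https://xxx.zendesk.com",
    [
        RelativePath = "/api/v2/incremental/tickets.json?include=comment_count",
        Headers = [#"Authorization" = "Basic " & Token],
        Query = [start_time = "1633089331"]  // the only part that changes between requests; must be text
    ]
)
```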


Sorry, I just saw your message. Are you working with the Zendesk API? I did manage to make it work in the end. You can play around with the data count number / Unix start time.

 

let
    BaseUrl = "https://xxx.zendesk.com",
    Token = "xxx",
    //start_time sits in the Query record, so the first argument of Web.Contents stays constant
    StartingOptions = [RelativePath="/api/v2/incremental/tickets.json?include=comment_count", Headers=[ #"Authorization" = "Basic " & Token ], Query=[start_time="1633089331"]],
    initReq = Json.Document(Web.Contents("https://xxx.zendesk.com", StartingOptions)),
    initData = initReq[tickets],
    gather = (tickets as list, QParm) =>
        let
            Token = "xxx",
            StartingOptions = [RelativePath="/api/v2/incremental/tickets.json?include=comment_count", Headers=[ #"Authorization" = "Basic " & Token ], Query=[start_time=QParm]],
            //the current page's end_time becomes the next start_time; Query values must be text
            newOffset = Text.From(Json.Document(Web.Contents("https://xxx.zendesk.com", StartingOptions))[end_time]),
            newOptions = [RelativePath="/api/v2/incremental/tickets.json?include=comment_count", Headers=[ #"Authorization" = "Basic " & Token ], Query=[start_time=newOffset]],
            newReq = Json.Document(Web.Contents("https://xxx.zendesk.com", newOptions)),
            newdata = newReq[tickets],
            data = List.Combine({tickets, newdata}),
            datacount = List.Count(data),
            //stop at end_of_stream or at a safety cap of 300,000 records
            check = if Text.From(newReq[end_of_stream]) = "true" or datacount >= 300000 then data else @gather(data, newOffset)
        in check,
    outputList = if Text.From(initReq[end_of_stream]) = "true" then initData else gather(initData, "1633089331"),
    expand = Table.FromRecords(outputList),
    #"Filtered Rows" = Table.SelectRows(expand, each ([status] <> "deleted"))
in
    #"Filtered Rows"

 


3 REPLIES 3

@Laila92 - I have a similar issue, are you able to explain a bit more as to what you mean and possibly provide and example please?
