Has anyone ever worked with sending a JSON payload that contains date ranges, and building a query to automate those values?
For example, the payload has:
{
    "id": 123456,
    "filters": [{
        "name": "dateCreated",
        "values": ["2016-07-01T00:00:00Z", "2016-08-01T00:00:00Z"]
    }]
}
If I request a monthly range, the data is larger than what a single call returns, since the API limits each response to 10,000 records. I need to chunk up the requests to gather the data in pieces, but still build the table for the entire range. I need to start at 2016-07-01 and pull up to the current date. However, older data won't need to be pulled again on refreshes, only the recent data that hasn't been pulled yet. I've got the query working in the Advanced Editor; it's just this more advanced part I'm stuck on now. Here's the full query:
let
    AuthKey = "XXXXXXXXXXXXXXXXXXXXXXXX",
    url = "XXXXXXXXXXXXXXXXXXXXXXXXXXXX",
    // Double quotes inside an M text literal have to be doubled up
    body = "
    {
        ""id"":123456,
        ""filters"": [{
            ""name"":""dateCreated"",
            ""values"":[""2016-07-01T00:00:00Z"",""2016-08-01T00:00:00Z""]
        }]
    }",
    Source = Json.Document(Web.Contents(url, [
        Headers = [
            #"Authorization" = AuthKey,
            #"Content-Type" = "application/json"
        ],
        Content = Text.ToBinary(body)
    ]))
in
    Source
Any help would be greatly appreciated! Thank you!
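For illustration, the kind of thing I'm trying to get to is the same call wrapped in a function that takes the date window as parameters, so each call pulls one chunk. This is only a sketch: GetChunk is a made-up name, AuthKey and url are the same placeholders as above, and it assumes the API treats the two values entries as a from/to pair.

let
    AuthKey = "XXXXXXXXXXXXXXXXXXXXXXXX",
    url = "XXXXXXXXXXXXXXXXXXXXXXXXXXXX",
    GetChunk = (fromDate as date, toDate as date) =>
        let
            // Rebuild the JSON body for the requested window ("" escapes a quote in M)
            body = "{""id"":123456,""filters"":[{""name"":""dateCreated"",""values"":["""
                & Date.ToText(fromDate, "yyyy-MM-dd") & "T00:00:00Z"","""
                & Date.ToText(toDate, "yyyy-MM-dd") & "T00:00:00Z""]}]}",
            // Same request as the original query, just with the dynamic body
            response = Json.Document(Web.Contents(url, [
                Headers = [
                    #"Authorization" = AuthKey,
                    #"Content-Type" = "application/json"
                ],
                Content = Text.ToBinary(body)
            ]))
        in
            response
in
    GetChunk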
Hi @dstanisljevic,
You want to pull the data and incrementally refresh the new data at the same time, right? If you have a Premium license, you could try setting up incremental refresh in Power BI. If you only have a Pro license, you still need to pull all the data and then configure a scheduled refresh for it.
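For reference, incremental refresh works off two datetime parameters named RangeStart and RangeEnd that you define in Power BI Desktop; on each refresh, Power BI substitutes the window for each partition. A rough sketch of how those parameters could be folded into your request body (AuthKey, url, and the id reuse your placeholders, and the exact response handling is only a guess at your API):

let
    AuthKey = "XXXXXXXXXXXXXXXXXXXXXXXX",
    url = "XXXXXXXXXXXXXXXXXXXXXXXXXXXX",
    // RangeStart/RangeEnd are the datetime parameters created for incremental refresh
    fmt = (dt as datetime) => DateTime.ToText(dt, "yyyy-MM-ddTHH:mm:ssZ"),
    body = "{""id"":123456,""filters"":[{""name"":""dateCreated"",""values"":["""
        & fmt(RangeStart) & """,""" & fmt(RangeEnd) & """]}]}",
    Source = Json.Document(Web.Contents(url, [
        Headers = [
            #"Authorization" = AuthKey,
            #"Content-Type" = "application/json"
        ],
        Content = Text.ToBinary(body)
    ]))
in
    Source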
Best Regards,
Xue Ding
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly. Kudos are nice too.
Hi Xue,
Thanks for the details, but that won't work. The date ranges need to be in the API call itself to get the data, and that call needs to happen repeatedly, rolling through the date ranges to build the full table of data.
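Roughly what I'm picturing is: generate the month boundaries from 2016-07-01 up to today, call the request once per window, and stack the results into one table. This is only a sketch; it assumes a GetChunk(from, to) function like the one sketched under my first post, and it guesses that each response exposes its rows under a records list (adjust that to the real response shape):

let
    startDate  = #date(2016, 7, 1),
    today      = Date.From(DateTime.LocalNow()),
    // Number of month-long windows needed to reach today
    monthCount = (Date.Year(today) - Date.Year(startDate)) * 12
                 + Date.Month(today) - Date.Month(startDate) + 1,
    // First-of-month boundaries, one past today so the last window covers it
    boundaries = List.Transform({0..monthCount}, each Date.AddMonths(startDate, _)),
    // Pair consecutive boundaries into from/to windows
    windows    = List.Transform({0..monthCount - 1},
                     each [from = boundaries{_}, to = boundaries{_ + 1}]),
    // One API call per window, each converted to a table of records
    chunkTables = List.Transform(windows,
                     each Table.FromRecords(GetChunk([from], [to])[records])),
    Combined   = Table.Combine(chunkTables)
in
    Combined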