stefaal
Frequent Visitor

Need help with Power Query to schedule dynamic refresh of data and keep old data

Hello,

 

  • I need to keep the data for the past 24 hours that I refresh in Power BI Desktop (that's how I model my report and dataset).
  • I have Premium and Pro accounts.
  • I have set the RangeStart and RangeEnd parameters.
  • I want to schedule a data refresh in the Power BI workspace that updates the data from the first point every 15 minutes, but there I get a "data source is not supported for refresh" error: "This dataset includes a dynamic data source."
  • I have read this blog post about "Faking Out" Web.Contents (http://blog.datainspirations.com/2018/02/17/dynamic-web-contents-and-power-bi-refresh-errors/), but I cannot seem to get it working.

Below is my query, any help would be much appreciated:

let
    EntitiesPerPage = 500,
    Limit = "&limit=" & Text.From(EntitiesPerPage),
    BaseUrl = "https://*******.********.com/api/1/rest/public/runtime/dev_org2?org_wide=true&last_hours=24" & Limit,

    // Fetch a URL and parse the JSON response
    GetJson = (url) =>
        let
            RawData = Web.Contents(url),
            Json = Json.Document(RawData)
        in
            Json,

    // Read the total entity count from the first page
    // (note: the local name must differ from BaseUrl to avoid a cyclic reference)
    GetEntityCount = () =>
        let
            FirstPageUrl = BaseUrl & "&offset=0",
            Json = GetJson(FirstPageUrl),
            Count = Json[response_map][total]
        in
            Count,

    // Fetch one page of entries
    GetPage = (Index) =>
        let
            // (option A) offset equal to previous row count
            Offset = "&offset=" & Text.From(Index * EntitiesPerPage),
            // (option B) offset equal to page number
            //Offset = "&offset=" & Text.From(Index),
            PageUrl = BaseUrl & Offset,
            Json = GetJson(PageUrl),
            Entries = Json[response_map][entries]
        in
            Entries,

    EntityCount = GetEntityCount(),
    PageCount = Number.RoundUp(EntityCount / EntitiesPerPage),
    PageIndices = {0 .. PageCount - 1},
    Pages = List.Transform(PageIndices, each GetPage(_)),
    Entities = List.Union(Pages),
    Table = Table.FromList(Entities, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Expanded Column1" = Table.ExpandRecordColumn(Table, "Column1", {"documents", "state_timestamp", "error_documents", "label", "path_id", "state", "create_time", "duration", "cc_label", "runtime_label"}, {"Column1.documents", "Column1.state_timestamp", "Column1.error_documents", "Column1.label", "Column1.path_id", "Column1.state", "Column1.create_time", "Column1.duration", "Column1.cc_label", "Column1.runtime_label"}),
    #"Renamed Columns" = Table.RenameColumns(#"Expanded Column1", {{"Column1.cc_label", "cc_label"}, {"Column1.create_time", "create_time"}, {"Column1.documents", "documents"}, {"Column1.duration", "duration"}, {"Column1.error_documents", "error_documents"}, {"Column1.label", "label"}, {"Column1.path_id", "path_id"}, {"Column1.state", "state"}, {"Column1.runtime_label", "runtime_label"}, {"Column1.state_timestamp", "state_timestamp"}}),
    #"Changed Type" = Table.TransformColumnTypes(#"Renamed Columns", {{"state_timestamp", type datetime}, {"create_time", type datetime}}),
    #"Filtered Rows" = Table.SelectRows(#"Changed Type", each [state_timestamp] >= RangeStart and [state_timestamp] < RangeEnd)
in
    #"Filtered Rows"

 

Regards,

Aleksandar

3 REPLIES 3
v-deddai1-msft
Community Support

Hi @stefaal ,

 

You need to use the RelativePath and Query options of Web.Contents() so that the base URL passed to Web.Contents() is static.

 

let
    RawData = Json.Document(
        Web.Contents(
            "https://*******.********.com/api",
            [
                RelativePath = "1/rest/public/runtime/dev_org2",
                // query-string parameters go in the Query record as text values
                Query = [org_wide = "true", last_hours = "24", limit = "500"]
            ]
        )
    )
in
    RawData

 

For more details, please refer to https://blog.crossjoin.co.uk/2016/08/16/using-the-relativepath-and-query-options-with-web-contents-i...
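
Applied to the paging in your query, the page-fetching helper could look roughly like this (a sketch only — the host, path, and parameter names are copied from your URL and may need adjusting to what the API actually accepts):

```
// Hypothetical rewrite of GetPage: the first argument to Web.Contents()
// is a static string, so the service can validate the data source
// before a scheduled refresh; the dynamic parts move into Query.
GetPage = (Index) =>
    let
        Json = Json.Document(
            Web.Contents(
                "https://*******.********.com/api",
                [
                    RelativePath = "1/rest/public/runtime/dev_org2",
                    Query = [
                        org_wide = "true",
                        last_hours = "24",
                        limit = "500",
                        offset = Text.From(Index * 500)
                    ]
                ]
            )
        )
    in
        Json[response_map][entries]
```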

 

If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.

 

Best Regards,

Dedmon Dai

Thank you guys, I am still trying to get this working, but no success with my query.

lbendlin
Super User

RangeStart and RangeEnd parameters are usually only important for incremental refresh data sources. I don't think that applies to your scenario.

 

Since you are already filtering for the last 24 hours in your data source (the Web.Contents() call), you don't really need to do anything beyond collecting your result pages. You may want to consider a List.Generate approach that compiles the list of pages to fetch based on the [response_map][total] value of your first call, with the appropriate offset. That should also avoid the issues you describe with service refresh.
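
A paging loop along those lines could be sketched like this (hypothetical — it assumes a GetPage(index) function like the one in the original query, and uses an empty page as the stop condition as a variant of driving the loop from [response_map][total]):

```
// Keep fetching pages until one comes back with no entries.
Pages = List.Generate(
    () => [Index = 0, Entries = GetPage(0)],                      // initial state: first page
    each not List.IsEmpty([Entries]),                             // continue while the page has rows
    each [Index = [Index] + 1, Entries = GetPage([Index] + 1)],   // advance to the next page
    each [Entries]                                                // value emitted per iteration
),
Entities = List.Union(Pages)
```

This avoids needing the total count up front; alternatively you could keep building `{0 .. PageCount - 1}` from the first call's total, as the original query does.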
