Mehay
Helper I

Dynamic Date and element in Power BI URL

Hi everyone, 

 

Please help me with this.

 

1. I'm trying to format my URL so that it picks up data from 2-3 days prior.

I found that the code below works for today's date, but how do I shift it back 2-3 days in the URL itself?

 

Date = DateTime.ToText((DateTime.LocalNow()), "yyyy-MM-dd"),

 

E.g. today is 12 Jun, but the data the URL should point to is the latest available, which would be 9 or 10 Jun.

 

So basically I need to build a date into the URL that picks up the latest available data.
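For reference, a minimal sketch of that offset in M, assuming a fixed two-day shift is what's wanted (the step name matches the one above):

// Shift today's date back two days and format it for the URL.
Date = Date.ToText(Date.AddDays(Date.From(DateTime.LocalNow()), -2), "yyyy-MM-dd"),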

 

 

 

2. Some elements in the URL change, either a number or a character.

Is it possible to do something similar to Python, like:

 

for x in range(0, 100):
and then include str(x) in the URL, so it automatically tries numbers from 0-100 together with the latest available date.
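M has no for-loop, but as a rough sketch of the same idea (the base URL below is a placeholder, not the real one), the candidate URLs can be built as a list and then tried one by one:

// Sketch only: build candidate URLs for the numbers 0..100 (placeholder base URL).
CandidateUrls = List.Transform(
    {0..100},
    each "https://example.com/report-" & Text.From(_) & ".csv"
),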
 
Thanks a lot in advance, any help is highly appreciated!
2 ACCEPTED SOLUTIONS
camargos88
Community Champion

@Mehay ,

 

Yes, it's like a macro in Excel.

It keeps all the steps and applies them when the refresh happens.



Did I answer your question? Mark my post as a solution!

Proud to be a Super User!



View solution in original post

camargos88
Community Champion

Hi @Mehay ,

 

We get this error when the page loads more slowly than the Power Query request expects. To fix it, use this code:

 

let
    Source = Web.BrowserContents("https://www.apple.com/covid19/mobility", [WaitFor = [Timeout = #duration(0,0,0,3)]]),
    Extract = Html.Table(
                            Source, 
                            {
                                {"Link", "#download-card > div.download-button-container > a", each [Attributes][href]}
                            },
                             [RowSelector="#download-card > div.download-button-container"]
                        ){0}[Link],
    GetData = Csv.Document(Web.Contents(Extract),[Delimiter=",",Encoding=65001, QuoteStyle=QuoteStyle.None])
in
    GetData

 

I just added a wait so Power Query gives the page time to load before reading it.



Did I answer your question? Mark my post as a solution!

Proud to be a Super User!



View solution in original post

16 REPLIES
Mehay
Helper I

Just for better understanding.

 

That's the URL I'm trying to deal with, the Apple Mobility Trends report:

 

https://covid19-static.cdn-apple.com/covid19-mobility-data/2009HotfixDev28/v3/en-us/applemobilitytrends-2020-06-09.csv

 

I highlighted what changes with every new report published (the date at the end of the URL).

So how can I automate it so that the latest report is loaded into my Power BI?

 

Thanks in advance.

camargos88
Community Champion

Hi @Mehay ,

 

Create this recursive function and run it with the current date:

 

 

(dt as date) as table =>
let
    Url = Web.Contents(
        "https://covid19-static.cdn-apple.com/covid19-mobility-data/2009HotfixDev28/v3/en-us/applemobilitytrends-" & Date.ToText(dt, "yyyy-MM-dd") & ".csv",
        [ManualStatusHandling = {404}]
    ),
    Result =
        if Value.Metadata(Url)[Response.Status] = 404 then
            @Query1(Date.AddDays(dt, -1))
        else
            Csv.Document(Url, [Delimiter = ",", Encoding = 65001, QuoteStyle = QuoteStyle.None])
in
    Result

 

Replace the Query1 reference with the function name.

 

[Attachment: Capture.PNG]

 



Did I answer your question? Mark my post as a solution!

Proud to be a Super User!



Hi @camargos88 ,

 

Indeed, the invoked function is a cool one; I personally like it.

But in this case it won't be automated the way I would do it in Python, say.

 

Moreover, whenever you choose a new date for the data to be loaded, it opens a new Invoked Function every time.

And that means I need to apply all my filters again and again. Is there a way for the function not to create a new one, but to replace the existing one with all its filters?

 

Sorry, I'm quite new to Power BI and therefore not that familiar with the M language either.

 

Thanks.

camargos88
Community Champion

Hi @Mehay ,

 

You just need to create a table with one value (the current date, using Date.From(DateTime.LocalNow())), invoke the function on it, and expand the resulting table.
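A minimal sketch of that step, assuming the function has been saved as fn_FindValidURL (the name used further down in this thread):

let
    // Single-row table holding today's date.
    Today = #table(type table [RunDate = date], {{Date.From(DateTime.LocalNow())}}),
    // Invoke the recursive function for that date; each row gets a nested table.
    WithData = Table.AddColumn(Today, "Data", each fn_FindValidURL([RunDate])),
    // Expand the nested table into regular columns.
    Expanded = Table.ExpandTableColumn(WithData, "Data", Table.ColumnNames(WithData{0}[Data]))
in
    Expanded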

 

The recursive function will keep decreasing the date until it finds a valid URL.



Did I answer your question? Mark my post as a solution!

Proud to be a Super User!



Hi  @camargos88,

 

You mean right here? Did I get it right?

 

And that code means it will automatically take the latest data whenever I click refresh, right? So basically I won't need to type dates into Invoke Function anymore?

 

Modified function as per your previous message:

let
Source = #"Apple Mobility Trend"(Date.From(DateTime.LocalNow())),

 

Thanks in advance,

Mehdi

camargos88
Community Champion

@Mehay ,

 

First create the function; I named it fn_FindValidURL:

 

 

 

(dt as date) as table =>
let
    Url = Web.Contents(
        "https://covid19-static.cdn-apple.com/covid19-mobility-data/2009HotfixDev28/v3/en-us/applemobilitytrends-" & Date.ToText(dt, "yyyy-MM-dd") & ".csv",
        [ManualStatusHandling = {404}]
    ),
    Result =
        if Value.Metadata(Url)[Response.Status] = 404 then
            @fn_FindValidURL(Date.AddDays(dt, -1))
        else
            Csv.Document(Url, [Delimiter = ",", Encoding = 65001, QuoteStyle = QuoteStyle.None])
in
    Result

 

 

 

 

And use this M code to call the function with the current date:

 

let
Source = fn_FindValidURL(Date.From(DateTime.LocalNow())),
#"Promoted Headers" = Table.PromoteHeaders(Source, [PromoteAllScalars=true]),
#"Changed Type" = Table.TransformColumnTypes(#"Promoted Headers",{{"geo_type", type text}, {"region", type text}, {"transportation_type", type text}, {"alternative_name", type text}, {"sub-region", type text}, {"country", type text}, {"2020-01-13", Int64.Type}, {"2020-01-14", Int64.Type}, {"2020-01-15", Int64.Type}, {"2020-01-16", Int64.Type}, {"2020-01-17", Int64.Type}, {"2020-01-18", Int64.Type}, {"2020-01-19", Int64.Type}, {"2020-01-20", Int64.Type}, {"2020-01-21", Int64.Type}, {"2020-01-22", Int64.Type}, {"2020-01-23", Int64.Type}, {"2020-01-24", Int64.Type}, {"2020-01-25", Int64.Type}, {"2020-01-26", Int64.Type}, {"2020-01-27", Int64.Type}, {"2020-01-28", Int64.Type}, {"2020-01-29", Int64.Type}, {"2020-01-30", Int64.Type}, {"2020-01-31", Int64.Type}, {"2020-02-01", Int64.Type}, {"2020-02-02", Int64.Type}, {"2020-02-03", Int64.Type}, {"2020-02-04", Int64.Type}, {"2020-02-05", Int64.Type}, {"2020-02-06", Int64.Type}, {"2020-02-07", Int64.Type}, {"2020-02-08", Int64.Type}, {"2020-02-09", Int64.Type}, {"2020-02-10", Int64.Type}, {"2020-02-11", Int64.Type}, {"2020-02-12", Int64.Type}, {"2020-02-13", Int64.Type}, {"2020-02-14", Int64.Type}, {"2020-02-15", Int64.Type}, {"2020-02-16", Int64.Type}, {"2020-02-17", Int64.Type}, {"2020-02-18", Int64.Type}, {"2020-02-19", Int64.Type}, {"2020-02-20", Int64.Type}, {"2020-02-21", Int64.Type}, {"2020-02-22", Int64.Type}, {"2020-02-23", Int64.Type}, {"2020-02-24", Int64.Type}, {"2020-02-25", Int64.Type}, {"2020-02-26", Int64.Type}, {"2020-02-27", Int64.Type}, {"2020-02-28", Int64.Type}, {"2020-02-29", Int64.Type}, {"2020-03-01", Int64.Type}, {"2020-03-02", Int64.Type}, {"2020-03-03", Int64.Type}, {"2020-03-04", Int64.Type}, {"2020-03-05", Int64.Type}, {"2020-03-06", Int64.Type}, {"2020-03-07", Int64.Type}, {"2020-03-08", Int64.Type}, {"2020-03-09", Int64.Type}, {"2020-03-10", Int64.Type}, {"2020-03-11", Int64.Type}, {"2020-03-12", Int64.Type}, {"2020-03-13", Int64.Type}, {"2020-03-14", Int64.Type}, {"2020-03-15", Int64.Type}, {"2020-03-16", Int64.Type}, {"2020-03-17", Int64.Type}, {"2020-03-18", Int64.Type}, {"2020-03-19", Int64.Type}, {"2020-03-20", Int64.Type}, {"2020-03-21", Int64.Type}, {"2020-03-22", Int64.Type}, {"2020-03-23", Int64.Type}, {"2020-03-24", Int64.Type}, {"2020-03-25", Int64.Type}, {"2020-03-26", Int64.Type}, {"2020-03-27", Int64.Type}, {"2020-03-28", Int64.Type}, {"2020-03-29", Int64.Type}, {"2020-03-30", Int64.Type}, {"2020-03-31", Int64.Type}, {"2020-04-01", Int64.Type}, {"2020-04-02", Int64.Type}, {"2020-04-03", Int64.Type}, {"2020-04-04", Int64.Type}, {"2020-04-05", Int64.Type}, {"2020-04-06", Int64.Type}, {"2020-04-07", Int64.Type}, {"2020-04-08", Int64.Type}, {"2020-04-09", Int64.Type}, {"2020-04-10", Int64.Type}, {"2020-04-11", Int64.Type}, {"2020-04-12", Int64.Type}, {"2020-04-13", Int64.Type}, {"2020-04-14", Int64.Type}, {"2020-04-15", Int64.Type}, {"2020-04-16", Int64.Type}, {"2020-04-17", Int64.Type}, {"2020-04-18", Int64.Type}, {"2020-04-19", Int64.Type}, {"2020-04-20", Int64.Type}, {"2020-04-21", Int64.Type}, {"2020-04-22", Int64.Type}, {"2020-04-23", Int64.Type}, {"2020-04-24", Int64.Type}, {"2020-04-25", Int64.Type}, {"2020-04-26", Int64.Type}, {"2020-04-27", Int64.Type}, {"2020-04-28", Int64.Type}, {"2020-04-29", Int64.Type}, {"2020-04-30", Int64.Type}, {"2020-05-01", Int64.Type}, {"2020-05-02", Int64.Type}, {"2020-05-03", Int64.Type}, {"2020-05-04", Int64.Type}, {"2020-05-05", Int64.Type}, {"2020-05-06", Int64.Type}, {"2020-05-07", Int64.Type}, {"2020-05-08", Int64.Type}, {"2020-05-09", Int64.Type}, {"2020-05-10", 
Int64.Type}, {"2020-05-11", type text}, {"2020-05-12", type text}, {"2020-05-13", Int64.Type}, {"2020-05-14", Int64.Type}, {"2020-05-15", Int64.Type}, {"2020-05-16", Int64.Type}, {"2020-05-17", Int64.Type}, {"2020-05-18", Int64.Type}, {"2020-05-19", Int64.Type}, {"2020-05-20", Int64.Type}, {"2020-05-21", Int64.Type}, {"2020-05-22", Int64.Type}, {"2020-05-23", Int64.Type}, {"2020-05-24", Int64.Type}, {"2020-05-25", Int64.Type}, {"2020-05-26", Int64.Type}, {"2020-05-27", Int64.Type}, {"2020-05-28", Int64.Type}, {"2020-05-29", Int64.Type}, {"2020-05-30", Int64.Type}, {"2020-05-31", Int64.Type}, {"2020-06-01", Int64.Type}, {"2020-06-02", Int64.Type}, {"2020-06-03", Int64.Type}, {"2020-06-04", Int64.Type}, {"2020-06-05", Int64.Type}, {"2020-06-06", Int64.Type}, {"2020-06-07", Int64.Type}, {"2020-06-08", Int64.Type}, {"2020-06-09", Int64.Type}}),
#"Unpivoted Other Columns" = Table.UnpivotOtherColumns(#"Changed Type", {"geo_type", "region", "transportation_type", "alternative_name", "sub-region", "country"}, "Attribute", "Value"),
#"Renamed Columns" = Table.RenameColumns(#"Unpivoted Other Columns",{{"Attribute", "Date"}}),
#"Changed Type1" = Table.TransformColumnTypes(#"Renamed Columns",{{"geo_type", type text}, {"region", type text}, {"transportation_type", type text}, {"alternative_name", type text}, {"sub-region", type text}, {"country", type text}, {"Date", type date}, {"Value", Int64.Type}})
in
#"Changed Type1"

 

This function starts with the current date and tries to find a valid URL; it stops when the first valid URL is found.

 

The part:

 

if Value.Metadata(Url)[Response.Status] = 404 then
    @fn_FindValidURL(Date.AddDays(dt, -1))
else
    Csv.Document(Url, [Delimiter = ",", Encoding = 65001, QuoteStyle = QuoteStyle.None])

 

checks whether the URL is valid: if 404 is returned, it tries again after decreasing the date by one day, and so on. If the status is anything other than 404, it reads the URL.

 

This link is updated with the date, and the previous files are not available anymore, correct?

 

[Attachment: Capture.PNG]

 

"and also this part in URL changes too:  "2009HotfixDev28/" now for 10 Jun data it shows to be 14 instead of 28."

 

If I got the correct current link, it's:

https://covid19-static.cdn-apple.com/covid19-mobility-data/2010HotfixDev14/v3/en-us/applemobilitytrends-2020-06-10.csv

 

It's possible to create a recursive function for that too; however, we are then talking about three variables, and it becomes a problem to call the link that many times.

 



Did I answer your question? Mark my post as a solution!

Proud to be a Super User!



And also this part in the URL changes too: "2009HotfixDev28/". Now for the 10 Jun data it shows 14 instead of 28.

 

Thanks, 

Mehdi

camargos88
Community Champion

@Mehay ,

 

We can try another approach for this: scrape this page and get the link directly:

 

 

let
    Source = Web.BrowserContents("https://www.apple.com/covid19/mobility"),
    Extract = Html.Table(
                            Source, 
                            {
                                {"Link", "#download-card > div.download-button-container > a", each [Attributes][href]}
                            },
                            [RowSelector="#download-card > div.download-button-container"]
                        ){0}[Link],
    GetData = Csv.Document(Web.Contents(Extract),[Delimiter=",",Encoding=65001, QuoteStyle=QuoteStyle.None])
in
    GetData

 

 

 



Did I answer your question? Mark my post as a solution!

Proud to be a Super User!



The second approach basically works well; it gets exactly what I need.

 

Just a quick question: will all my filters remain the same even after new data is uploaded on Apple's side?

camargos88
Community Champion

@Mehay ,

 

Yes, it's like a macro in Excel.

It keeps all the steps and applies them when the refresh happens.



Did I answer your question? Mark my post as a solution!

Proud to be a Super User!



Dear @camargos88 ,

 

Now every time I try to refresh the data it gives me an error like this about the href field. Sometimes I open the code and just click OK, and then the data seems fine again; I'm not sure what causes the problem.

 

[Attachment: Screenshot (3).png]

 

 

camargos88
Community Champion

Hi @Mehay ,

 

We get this error when the page loads more slowly than the Power Query request expects. To fix it, use this code:

 

let
    Source = Web.BrowserContents("https://www.apple.com/covid19/mobility", [WaitFor = [Timeout = #duration(0,0,0,3)]]),
    Extract = Html.Table(
                            Source, 
                            {
                                {"Link", "#download-card > div.download-button-container > a", each [Attributes][href]}
                            },
                             [RowSelector="#download-card > div.download-button-container"]
                        ){0}[Link],
    GetData = Csv.Document(Web.Contents(Extract),[Delimiter=",",Encoding=65001, QuoteStyle=QuoteStyle.None])
in
    GetData

 

I just added a wait so Power Query gives the page time to load before reading it.



Did I answer your question? Mark my post as a solution!

Proud to be a Super User!



@camargos88 ,

 

Once again, thanks for your help. It seems to be working for now; hopefully that was the only error.

Have a great day ahead!

 

Regards, 

Mehdi

@camargos88 ,

 

Perfect, I really appreciate your help, and thanks for all the effort!

It worked perfectly, so now I just need to set up a scheduled refresh once a day.

 

Regards, 

Mehdi

Dear @camargos88,

 

First of all, I appreciate your help.

 

Yes, your function basically worked fine; I just copied all the code you gave.

The thing is, if I don't change the 28 in the URL to 14, it still sticks to the data up to 9 Jun.

Only when I changed it to 14 did it give me the data I needed, up to 10 Jun.

 

I checked with the old URL, and it's probably still saved in Apple's database under the same old link, but it isn't available publicly on the website.

The website only provides the latest data.

So I think that would be another problem with this formula.

 

Just a thought: is it possible to have code that sticks strictly to d-2, i.e. two days before (today is 12 Jun, so it would get 10 Jun), and then counts from 0-100 for that middle number in the URL, so the system figures out which number matches the 10 Jun data?
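A rough sketch of that idea in M, as a sketch only: the date is fixed at d-2 and the middle number is searched from 0 to 100 until a URL does not return 404. The "2010HotfixDev" prefix is an assumption taken from the 10 Jun link earlier in the thread, and hitting the server up to 100 times per refresh may be slow:

let
    // Fix the date at two days before today.
    TargetDate = Date.ToText(Date.AddDays(Date.From(DateTime.LocalNow()), -2), "yyyy-MM-dd"),
    // Build the URL for a given middle number (prefix "2010HotfixDev" is an assumption).
    MakeUrl = (n as number) as text =>
        "https://covid19-static.cdn-apple.com/covid19-mobility-data/2010HotfixDev" & Text.From(n)
            & "/v3/en-us/applemobilitytrends-" & TargetDate & ".csv",
    // A URL counts as valid when the response status is not 404.
    IsValid = (n as number) as logical =>
        Value.Metadata(Web.Contents(MakeUrl(n), [ManualStatusHandling = {404}]))[Response.Status] <> 404,
    // First number in 0..100 whose URL is valid (errors if none is found).
    FirstValid = List.First(List.Select({0..100}, IsValid)),
    Data = Csv.Document(Web.Contents(MakeUrl(FirstValid)), [Delimiter = ",", Encoding = 65001, QuoteStyle = QuoteStyle.None])
in
    Data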

 

Thanks again.

Mehdi

Anonymous
Not applicable

Hi Mehdi,

 

I'm struggling with the same code; could you forward it to me please?
