Good day,
I need help please. I have multiple data sources (Salesforce, Zendesk, NetSuite, etc.). The NetSuite source I execute using an Azure Function app; however, this currently runs daily even though NetSuite only has financial data up until the 15th, and my Function app has started becoming expensive: ZAR 3,000 monthly for approximately 20 refreshes. With this cost I could have paid for the license.
I am trying to limit this, as the rest of the data sources need to refresh daily (Mon-Fri). Is there a way I can split when the data sources refresh?
Or I am thinking, where I call the Azure Function in Power Query, to build it around an "if day <= 15 then execute function else do nothing" condition, but won't the service scheduled refresh still execute the Azure Function regardless of the if statement?
This executes the Azure Function and returns a token:
let
    Source = () =>
        let
            // Call the Azure Function and read its text response into a single-column table
            Source = Table.FromColumns({Lines.FromBinary(Web.Contents("AZURE FUNCTION"))}),
            // The first row of Column1 holds the returned token
            Column1 = Source{0}[Column1]
        in
            Column1
in
    Source
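For context, the other NetSuite sources presumably consume this token roughly as in the sketch below; the GetToken name, the endpoint URL, and the bearer-header usage are assumptions rather than the actual setup.
let
    // GetToken is assumed to be the token function above; the URL is a placeholder
    Token = GetToken(),
    // Pass the token with the data request (header name and format are an assumption)
    Response = Web.Contents(
        "https://example-netsuite-endpoint.invalid/financials",
        [Headers = [Authorization = "Bearer " & Token]]
    ),
    Source = Table.FromColumns({Lines.FromBinary(Response)})
in
    Source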
Please help, any ideas?
Regards
Hi @icassiem,
Thank you for providing the explanation. I recognize the difficulty involved in balancing cost management with maintaining up-to-date KPIs.
Since Power BI does not support configuring separate refresh schedules for each data source within a single dataset, and using a date condition in your Azure Function call still results in execution (causing token expiration and additional costs), the following approach offers an effective solution:
Split the Dataset Based on Refresh Frequency
To gain more control over refresh timing and reduce Azure Function costs, consider splitting your data sources into two separate Power BI datasets. Dataset A can include Salesforce, Zendesk, and other sources that require a daily refresh, while Dataset B should contain only the NetSuite data fetched via the Azure Function, which you can schedule to refresh only on specific days (e.g., the 1st and 15th of each month). To implement this, move the NetSuite query into a new .pbix file and publish it as a standalone dataset. Then, in your main Power BI report, use Composite Models or DirectQuery for Power BI Datasets to bring in both datasets. This approach keeps your report fully functional while significantly reducing unnecessary Azure Function executions and giving you better control over refresh schedules and associated costs.
Thank you! We’re glad to have you in the Microsoft Fabric Community and feel free to drop by anytime you need help or want to share ideas.
Regards,
CST Member.
Thank You @v-sgandrathi
I have read something similar, but it feels like a redevelopment of the solution.
I think I will first try to get the refresh rate down to about 3 times a month, to bring the cost closer to ZAR 2,000.
Not ideal, but we are going live to board members, so I can't break it and then redo it 😞
@icassiem You can modify your Power Query to conditionally call the Azure Function based on the day of the month. This way, the function will only execute when necessary. However, this approach will still trigger the scheduled refresh, which might not fully solve the cost issue.
let
    currentDate = DateTime.LocalNow(),
    currentDay = Date.Day(currentDate),
    // Only call the Azure Function during the first half of the month
    Source = if currentDay <= 15 then
        Table.FromColumns({Lines.FromBinary(Web.Contents("AZURE FUNCTION"))})
    else
        null
in
    Source
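One caveat with the sketch above: if a query that loads to the data model returns null, the refresh of that query will fail. A minimal variant, assuming the single default column produced by Table.FromColumns is named Column1 (an assumption carried over from the original query), returns an empty table of the same shape on skip days. Note that this only keeps the query loadable; it does not refresh the token that the other NetSuite sources depend on.
let
    currentDate = DateTime.LocalNow(),
    currentDay = Date.Day(currentDate),
    // Hypothetical fallback: an empty table with the same single column (Column1),
    // so the query still loads on days when the Azure Function is not called
    EmptyResult = #table(type table [Column1 = text], {}),
    Source = if currentDay <= 15 then
        Table.FromColumns({Lines.FromBinary(Web.Contents("AZURE FUNCTION"))})
    else
        EmptyResult
in
    Source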
Instead of refreshing the data source daily, you can adjust the refresh schedule to run less frequently for the NetSuite data. For example, you can set it to refresh only on specific days of the month (e.g., the 1st and 15th). This can be done using the scheduling options in your data integration tool or service.
Thank You @bhanu_gautam
I have included this day clause in my Azure Function app call from within Power Query, but like you mention, the service daily refresh might still refresh the data source.
I can't split the schedule, as my KPI scorecard uses the other KPI data sources; the scheduled refresh can only refresh all the data sources, not just certain ones.
Any other ideas please?
Regards
Also, if I put the day condition around the function call, the other sources fail, as the token has then expired because the Azure Function app did not refresh it.