Hi,
I want to refresh a pipeline every half hour, but only from 6 am till 8 pm so we don't waste CUs at night.
However, the schedule settings are very limited (on purpose?). "By the minute" lets me run it every half hour continuously, but does not allow me to set a timespan in which it should be active.
"Daily" allows me to enter set times manually, but oddly only up to 10 (with auto refresh on semantic models you can add many more).
One 'solution' Microsoft mentions here is to create multiple pipelines that each run during part of the day and invoke the main pipeline from them, but I'm hoping there's a better way.
Is there maybe an activity I can add to the pipeline that checks the time and only continues to run if the time is between 6 am and 8 pm?
I prefer not to use other Azure services such as Azure Functions, as the whole idea behind Fabric is that it should be all you need.
Fabric data pipelines now have an event-based trigger in preview, so if you are comfortable using other offerings such as Logic Apps, Azure Data Factory, or Power Automate, which do allow flexible schedules, you can create a job there that uploads a file to blob storage and event-triggers the pipeline, or triggers the pipeline via the REST API.
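For the REST route, a minimal sketch of the on-demand job run call an external scheduler (Logic Apps, Power Automate, etc.) could issue on whatever schedule it supports; the workspace ID, pipeline item ID, and bearer token are placeholders you would supply from your own tenant, and the request body can stay empty unless you want to pass parameters:

```
POST https://api.fabric.microsoft.com/v1/workspaces/{workspaceId}/items/{pipelineId}/jobs/instances?jobType=Pipeline
Authorization: Bearer <access-token>
Content-Type: application/json
```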
But to answer your query: the way would be to use an If Condition activity at the beginning of the pipeline to check whether the trigger time is in the range you need. If it is, put the rest of your flow inside the true branch; otherwise the run simply skips it.
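A minimal sketch of such an If Condition expression, assuming the 6:00–20:00 window is evaluated in UTC (wrap utcnow() in convertTimeZone() if the window should follow your local time); when it evaluates to true, the activities nested in the true branch run, otherwise the pipeline ends without doing any real work:

```
@and(
    greaterOrEquals(int(formatDateTime(utcnow(), 'HH')), 6),
    less(int(formatDateTime(utcnow(), 'HH')), 20)
)
```

You can then keep the simple "every 30 minutes" schedule; runs outside the window still start, but only execute the cheap time check.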