MG86
Advocate II

How to schedule pipeline to run every 30 minutes between specific times?

Hi,

I want to refresh a pipeline every half hour, but only from 6 am till 8 pm, so we don't waste CUs at night.
However, the schedule settings are very limited (on purpose?). The per-minute option lets me run it every half hour continuously, but does not allow me to set a time span in which the schedule should be active.

The daily option allows me to enter set times manually, but, weirdly enough, only up to 10 of them (with automatic refresh on semantic models you can add many more).

One 'solution' mentioned here by Microsoft was to create multiple pipelines that each run during part of the day and invoke the main pipeline from them, but I'm hoping there's a better way.

Is there maybe an activity I can add to the pipeline that checks the time and only continues to run if the time is between 6 am and 8 pm?

I prefer not to use other Azure services such as Azure Functions, as the whole idea behind Fabric should be that it is all you need.

1 ACCEPTED SOLUTION
NandanHegde
Super User

Fabric data pipelines now have an event-based trigger in preview, so if you are comfortable using other offerings like Logic Apps, Azure Data Factory, or Power Automate (which do allow flexible schedules), you can create a job there that uploads a file to blob storage and lets the event trigger the pipeline, or have it trigger the pipeline via the REST API.

 

But to answer your query: the way would be to use an If Condition activity at the beginning to check whether the pipeline trigger time is in the range that you need. If it is, put your further flow inside the True branch; otherwise, leave the False branch empty so the run effectively skips.

 

@pipeline().TriggerTime is the system variable you can use for this check.
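
For example, a minimal sketch of what the If Condition expression could look like (note that TriggerTime is reported in UTC, so if your 6 am - 8 pm window is in local time you may need convertFromUtc first; 'W. Europe Standard Time' below is only a placeholder time zone, and you can drop that call if UTC hours are what you want):

@and(
    greaterOrEquals(int(formatDateTime(convertFromUtc(pipeline().TriggerTime, 'W. Europe Standard Time'), 'HH')), 6),
    less(int(formatDateTime(convertFromUtc(pipeline().TriggerTime, 'W. Europe Standard Time'), 'HH')), 20)
)

With the every-30-minutes schedule left on around the clock, runs that start outside the 6 am - 8 pm window go straight to the empty False branch and finish almost immediately, so they consume very little capacity.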



----------------------------------------------------------------------------------------------
Nandan Hegde (MSFT Data MVP)
LinkedIn Profile: www.linkedin.com/in/nandan-hegde-4a195a66
GitHub Profile: https://github.com/NandanHegde15
Twitter Profile: @nandan_hegde15
MSFT MVP Profile: https://mvp.microsoft.com/en-US/MVP/profile/8977819f-95fb-ed11-8f6d-000d3a560942
Topmate: https://topmate.io/nandan_hegde
Blog: https://datasharkx.wordpress.com
