Today’s organizations demand real-time responsiveness from their analytics platforms. When data processing relies on scheduled job runs, insights and actions are delayed, and decisions are based on stale data. Whether your data lands in Azure Blob Storage or OneLake, it should be processed the moment it arrives to ensure timely decisions and continuous data freshness. Fabric events and Azure events make that possible by enabling event-driven data workflows that react in real time to new data, without manual triggers or schedules.
In this blog, you’ll learn how to configure an event-driven data pipeline that is triggered automatically when a new file lands in OneLake or Azure Blob Storage, ingesting and transforming the file as soon as it arrives.
Fabric jobs, like data pipelines and notebooks, can be scheduled to run at fixed intervals, but data doesn’t always arrive on a predictable schedule. This mismatch can lead to stale data and delayed insights.
Fabric events and Azure events solve this problem by emitting events when a file is created, updated, or deleted in OneLake or Azure Blob Storage. These events can be consumed by Activator, which can trigger Fabric items (e.g., data pipelines or notebooks) or Power Automate workflows.
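To make this concrete, the sketch below shows a representative BlobCreated notification in the Azure Event Grid event schema, along with the kind of file filter Activator lets you apply before firing an action. All field values are illustrative, and `should_trigger` is a hypothetical helper standing in for Activator’s built-in filter conditions, not a Fabric API.

```python
# A representative Azure Blob Storage "BlobCreated" event (Event Grid schema).
# Values are illustrative; OneLake file events carry a similar shape with
# OneLake-specific event types and subjects.
blob_created_event = {
    "topic": "/subscriptions/<sub-id>/resourceGroups/<rg>/providers"
             "/Microsoft.Storage/storageAccounts/<account>",
    "subject": "/blobServices/default/containers/landing/blobs/sales_2024.csv",
    "eventType": "Microsoft.Storage.BlobCreated",
    "eventTime": "2025-01-15T09:30:00Z",
    "id": "831e1650-001e-001b-66ab-eeb76e000000",
    "data": {
        "api": "PutBlob",
        "contentType": "text/csv",
        "contentLength": 524288,
        "blobType": "BlockBlob",
        "url": "https://<account>.blob.core.windows.net/landing/sales_2024.csv",
    },
    "dataVersion": "",
    "metadataVersion": "1",
}

def should_trigger(event: dict) -> bool:
    """Hypothetical stand-in for an Activator filter: react only to new CSVs."""
    return (
        event["eventType"] == "Microsoft.Storage.BlobCreated"
        and event["subject"].endswith(".csv")
    )

print(should_trigger(blob_created_event))  # True
```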
This event-driven workflow enables:
- Processing that starts the moment new data arrives, rather than at the next scheduled run
- Continuous data freshness for downstream analytics
- Automation without manual triggers or schedules
In this tutorial, you will:
- Create a Lakehouse to receive the CSV files
- Build a data pipeline that ingests and transforms new files
- Connect the pipeline to OneLake events through Activator
- Test the workflow end to end by uploading a file
First, let’s create a Lakehouse where we will upload the CSV files and store the resulting table.
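If you prefer to prepare the landing folder from a notebook rather than the Lakehouse UI, a minimal sketch follows. It assumes the Fabric Spark runtime (where `mssparkutils` is available) and a folder name of `landing`, which is our choice for this walkthrough.

```python
# Run inside a Fabric notebook attached to the new Lakehouse.
from notebookutils import mssparkutils

# Create the folder the event trigger will watch ("landing" is an assumption).
mssparkutils.fs.mkdirs("Files/landing")

# Confirm the folder exists under the Lakehouse Files area.
print(mssparkutils.fs.ls("Files/"))
```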
Next, configure a data pipeline to ingest, transform, and deliver the data into your Lakehouse.
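As a sketch of what the transform step could look like when implemented as a notebook activity inside the pipeline, here is a minimal PySpark example. The source path `Files/landing` and target table `sales_clean` are assumptions for this walkthrough, and `spark` is the session Fabric notebooks provide automatically.

```python
from pyspark.sql import functions as F

# Ingest every CSV currently in the landing folder. (The pipeline could also
# pass the specific file name from the triggering event as a parameter.)
raw = spark.read.option("header", True).csv("Files/landing/*.csv")

# Light transformation: normalize column names and stamp the load time.
clean = (
    raw.toDF(*[c.strip().lower().replace(" ", "_") for c in raw.columns])
       .withColumn("loaded_at", F.current_timestamp())
)

# Deliver the result as a Delta table in the Lakehouse.
clean.write.format("delta").mode("append").saveAsTable("sales_clean")
```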
Then, in Activator, subscribe to OneLake events for the source folder and set the data pipeline as the action to run. This setup ensures your pipeline runs instantly whenever a new file appears in the source folder.
To test your workflow:
- Upload a sample CSV file into the monitored folder (a programmatic upload sketch follows this list)
- Watch the OneLake event fire and the pipeline start within moments
- Verify the transformed table in your Lakehouse
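For the upload itself, you can of course drag a file into the Lakehouse UI; the sketch below instead uploads programmatically through OneLake’s ADLS Gen2-compatible endpoint, which is handy for repeated testing. The workspace, Lakehouse, and file names are placeholders, and it assumes `azure-storage-file-datalake` and `azure-identity` are installed.

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# OneLake exposes an ADLS Gen2-compatible endpoint; the workspace acts as the
# file system and the Lakehouse as the top-level directory.
service = DataLakeServiceClient(
    account_url="https://onelake.dfs.fabric.microsoft.com",
    credential=DefaultAzureCredential(),
)

fs = service.get_file_system_client("MyWorkspace")  # placeholder workspace
file = fs.get_file_client("MyLakehouse.Lakehouse/Files/landing/sales_2024.csv")

with open("sales_2024.csv", "rb") as f:
    file.upload_data(f, overwrite=True)  # this write emits a OneLake file event
```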
No manual refresh. No waiting for the next scheduled run. Your pipeline runs in real time.
With just a few steps, we’ve built a responsive, event-driven workflow. Every time data lands in your Lakehouse, it’s automatically ingested, transformed, and ready for downstream analytics. While this demo focused on OneLake Events, you can achieve the same scenario using Azure Blob Storage events.
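For the Blob Storage variant, the trigger-side test is the same idea: drop a file into the monitored container and a BlobCreated event is raised. A minimal sketch with the `azure-storage-blob` SDK follows; the account URL and container name are placeholders.

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    account_url="https://<account>.blob.core.windows.net",  # placeholder
    credential=DefaultAzureCredential(),
)

blob = service.get_blob_client(container="landing", blob="sales_2024.csv")
with open("sales_2024.csv", "rb") as f:
    # Raises Microsoft.Storage.BlobCreated, which Activator can consume.
    blob.upload_blob(f, overwrite=True)
```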
Beyond the use case we explored, here are additional scenarios where you can leverage OneLake and Azure Blob Storage events in Microsoft Fabric:
Ready to streamline your Fabric applications with an event-driven architecture? Start exploring Fabric events and Azure events today to unlock real-time automation in your data workflows. To learn more, see the Azure and Fabric events documentation.
Stay tuned for new event group types, consumers, and enhancements for Azure and Fabric events that will further simplify real-time data processing, automation, and analytics. We are committed to improving the event-driven capabilities in Fabric, so we encourage you to share your suggestions and feedback on Fabric Ideas under the Real-Time Hub category and join the conversation on the Fabric Community.