In this blog, I’ll walk you through a step-by-step implementation of a solution that monitors file uploads to a Lakehouse and transforms them into Delta Tables using Fabric’s low-code tools. The workflow is completely dynamic and extensible, making it a great fit for real-time ingestion pipelines.
Setting Up the Fabric Workspace and Lakehouse
We begin by creating a Fabric Workspace called Blog_WS (1) and a Lakehouse named Blog_LH (2). Next, we open the Lakehouse and upload a CSV file from the local drive.
A sample orders.csv file is uploaded to the Files section of the Lakehouse.
Using the file menu (...), we click Load to Tables (3).
This action manually loads the file into the Tables section as a Delta Table (4).
This step is manual; our goal is to automate this workflow.
To prep for automation, we delete both the uploaded file and the Delta table to simulate a fresh workflow scenario.
Designing the Ingestion Pipeline
Step 1: Add a Get Metadata Activity
We now build a Data Pipeline called Blog_pipeline (5) using the pipeline components, then click the Get Metadata (6) activity to add it.
With the Get Metadata (7) activity on the canvas, click Settings (8) and select Blog_LH (9) as the connection.
Choose Files (10) as the Root folder and DelimitedText (11) as the File format, then click + New (12) and select Child Items (13) as the Argument; this retrieves the filenames dynamically.
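For reference, Child Items returns an array of objects describing each item under the Files folder. With just our sample file present, the Get Metadata output would look roughly like this (an illustrative sketch of the standard childItems shape, not captured from the actual run):
"childItems": [ { "name": "orders.csv", "type": "File" } ]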
Step 2: Add a ForEach Activity
Next, add a ForEach (14) activity and connect the Get Metadata activity to it. Click Settings (15), enable the Sequential check box (16), and click the Items field (17); a pane opens on the right side. Select the Get Metadata childItems output (18) so the expression is generated in the code box (19), then click OK.
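The expression generated in the Items code box is typically of the form below, assuming the Get Metadata activity keeps its default name (Get Metadata1); adjust the activity name to match your pipeline:
@activity('Get Metadata1').output.childItems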
Step 3: Add a Copy Data Activity Inside the ForEach
You will now see the expression in the Items field. Click the + icon (20) inside the ForEach and select Copy data (21).
The Copy data (22) activity now sits inside the ForEach. Click Source (23), choose Blog_LH (24) as the connection, select Files (25) as the Root folder, enter the dynamic expression (26) in File path, enable the Recursively (27) check box, and select DelimitedText (28) as the File format.
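The File path expression references the filename of the current ForEach item. A minimal sketch, assuming the files sit directly under the Files root folder, is:
@item().name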
Next, click Destination (29), choose Blog_LH (30) as the connection, select Tables (31) as the Root folder, enter the dynamic expression (32) for Table so the table name is derived from the file, and select Overwrite (33) as the Table action.
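A common pattern for the Table expression is to strip the .csv extension from the filename so each Delta Table is named after its source file. The exact expression from the screenshots isn’t shown here, but a sketch along these lines works:
@replace(item().name, '.csv', '')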
Click Save (34). The pipeline is saved successfully (35).
Configuring Event Triggers via Data Activator
To eliminate manual initiation, we set up an event-driven trigger:
Click Add trigger (36); a Set alert window appears. Then click Select events (37).
In the window that opens, you can select OneLake events (38).
Under Select events, choose Microsoft.Fabric.OneLake.FileCreated (39) and click Add a OneLake source (40).
Choose Blog_LH (41) and click Next (42).
Select Files (43) and click Add (44).
With the path selected, click Next (45).
Click Save (46).
The OneLake events (47) are now listed; configure any additional settings (48) and click Create (49).
With this, any new file placed in the Lakehouse Files folder automatically kicks off the pipeline.
Testing the End-to-End Automation
The alert is created. Click Open (50).
We navigate to the Activator tab, where the activator has already started. Next, we upload a CSV file to the Files section.
We upload orders.csv to the Files section (52).
Looking at the Activator tab, the event is detected (53).
In the Data Pipeline tab, clicking View run history shows the pipeline run has started (54); to monitor it, click Go to Monitor (55).
In the Monitor tab, we can see the pipeline run succeeded (56). Now we check the Lakehouse Tables section.
In Tables, the Orders Delta Table appears automatically (57).
Final Thoughts
This implementation demonstrates the power and elegance of Microsoft Fabric in orchestrating real-time data workflows. By leveraging:
• Workspaces
• Lakehouse
• Pipelines
• Event-Driven Triggers via Data Activator
we’ve built a low-code, event-based ingestion system that transforms file uploads into Delta Tables dynamically and at scale.
No more manual refreshes. No more script errors. Just automated intelligence woven directly into your Lakehouse architecture.
With this approach, whenever a file is dropped into the Files section of the Lakehouse, it is automatically ingested and loaded as a Delta Table in the Tables section, with zero manual intervention. That’s the magic of Microsoft Fabric.
Take a look at the tutorial for more details: https://youtu.be/LjIcNXsSKSg?si=H3E55PX1zWfziywP
Let’s automate smart, not hard!
— Inturi Suparna Babu