
Auto-Magic Data Ingestion in Microsoft Fabric Lakehouse using Data Activator & Pipelines

In this blog, I’ll walk you through a step-by-step implementation of a solution that monitors file uploads to a Lakehouse and transforms them into Delta Tables using Fabric’s low-code tools. The workflow is completely dynamic and extensible, making it a great fit for real-time ingestion pipelines.

 

Setting Up the Fabric Workspace and Lakehouse


We begin by creating a Fabric Workspace called Blog_WS(1) and a Lakehouse named Blog_LH(2). Next, we open the Lakehouse and upload a CSV file from the local drive.


A sample orders.csv file is uploaded to the Files section of the Lakehouse.


Using the file menu (...), we click Load to Tables (3).


This action loads the file into the Tables section as a Delta Table (4).
This step is manual, however, and our goal is to automate the workflow.
To prepare for automation, we delete both the uploaded file and the Delta table to simulate a fresh workflow scenario.

 

Designing the Ingestion Pipeline

Step 1: Add Get Metadata Activity:


We now build a Data Pipeline called Blog_pipeline(5) using the pipeline components, then click on the Get Metadata(6) activity.


 

The Get Metadata(7) activity is now added to the canvas. Click on Settings(8) and select Blog_LH(9) as the connection.


Now choose Files(10) as the Root folder and DelimitedText(11) as the File format, click on +New(12), and select Child Items(13) from Argument; this retrieves the filenames dynamically.
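For context, the Child Items argument makes Get Metadata return the list of items found under the Files folder, and that list is what the next activity will loop over. A rough sketch of what the output looks like for our single upload, based on the standard childItems format (the actual payload may include additional fields):

    {
      "childItems": [
        { "name": "orders.csv", "type": "File" }
      ]
    }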

 

Step 2: Add ForEach Activity:


Next, add a ForEach(14) activity and connect Get Metadata to it. Click on Settings(15), enable the Sequential check box(16), and configure the Items field(17); a window opens on the right side. Select the Get Metadata childitems(18), the code is generated in the code box(19), then click OK.
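For reference, the expression generated in the Items code box typically looks like the following; the activity name Get Metadata1 is the default and should match whatever your Get Metadata activity is actually called:

    @activity('Get Metadata1').output.childItems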

 

Step 3: Add Copy Data Activity Inside ForEach


 

You will now see the code in Items. Click on the + icon(20) inside the ForEach and select Copy data(21).


We can now see the Copy data(22) activity inside the ForEach. Click on Source(23), choose Blog_LH(24) from connections, choose Files(25) from Root folder, enter the code(26) in File path, enable the Recursively(27) check box, and select DelimitedText(28) from File format.
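The code entered in File path is a dynamic expression that resolves to the file currently being processed by the ForEach loop. A minimal sketch, assuming the CSV files sit directly under the Files root so only the file name needs to be dynamic:

    @item().name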


 

Now click on Destination(29), choose Blog_LH(30) from connection, select Tables(31) from Root folder, enter the code(32) in Table to derive the table name dynamically, and select Overwrite(33) from Table action.
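Here the code is again a dynamic expression, this time deriving the Delta table name from the incoming file name. One possible sketch, which simply strips the .csv extension (the exact expression in your setup may differ):

    @replace(item().name, '.csv', '')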


Now click on Save(34). The pipeline is saved successfully(35).

 

Configuring Event Triggers via Data Activator

To eliminate manual initiation, we set up an event-driven trigger.


 

Click on Add trigger(36); a Set Alert window appears. Then click on Select events(37)


After clicking on Select events, you can choose OneLake events(38)


In Select Events, choose Microsoft.Fabric.OneLake.FileCreated(39) and click on Add a OneLake source(40)


Choose Blog_LH(41) and click on Next(42)

 


 

Select Files(43) and click on Add(44)


 

The path is now selected; click on Next(45)


Click on Save(46)


Now you can see OneLake events(47). Configure additional settings(48) and click on Create(49).
With this, any new file placed in the Lakehouse Files folder automatically kicks off the pipeline.

 

Testing the End-to-End Automation


Alert created. Click on Open(50)


We now navigate to the Activator tab; it has already started. Next, we’ll upload a CSV file into our Files section.


Uploaded orders.csv to the Files section (52).


Looking at the Activator tab, the event is detected (53).


In the Data Pipeline tab, clicking on View Run History shows that the pipeline run has started(54). To monitor it, click on Go to Monitor(55)


In the Monitor tab, we can see the pipeline run succeeded (56). Now we’ll check the Lakehouse Tables section.


In Tables, the Orders Delta Table appears automatically (57)

 

Final Thoughts:


This implementation demonstrates the power and elegance of Microsoft Fabric in orchestrating real-time data workflows. By leveraging:

• Workspaces
• Lakehouse
• Pipelines
• Event-Driven Triggers via Data Activator


we’ve built a low-code, event-based ingestion system that transforms file uploads into Delta Tables dynamically and at scale.
No more manual refreshes. No more script errors. Just automated intelligence woven directly into your Lakehouse architecture.


With this approach, whenever a file is dropped into the Files section of the Lakehouse, it is automatically ingested and loaded as a Delta Table in the Tables section, with zero manual intervention. That’s the magic of Microsoft Fabric.

 

Take a look at the tutorial for more details: https://youtu.be/LjIcNXsSKSg?si=H3E55PX1zWfziywP


Let’s automate smart, not hard!

— Inturi Suparna Babu

LinkedIn