
suparnababu8

Seamless Data Migration from On-Prem SQL Server to Fabric Lakehouse

In this blog, I’ll walk you through migrating multiple SQL Server tables into a Fabric Lakehouse using Data Pipelines. This approach is particularly effective for legacy modernization or for building scalable ETL frameworks in Microsoft Fabric.

 

Scenario Setup

In this example, I’m migrating 11 tables (2) from my local SQL Server database named Fabric_db (1) to a Lakehouse in Fabric called Migration_LH.

suparnababu8_0-1749215472159.png

 

 

Step 1: Create a New Data Pipeline

Navigate to your Fabric workspace, open the Lakehouse Migration_LH (3), and click New data pipeline (4).

suparnababu8_0-1749215628896.png

A naming window will open. Give the pipeline a relevant name, such as migration_pipeline (5).

suparnababu8_1-1749215695193.png

Step 2: Use a Lookup Activity to Fetch Table Metadata

Add a Lookup activity (7) from the pipeline Activities (6) pane.

suparnababu8_2-1749215732182.png

Now open the Lookup activity’s Settings (8).

suparnababu8_3-1749215757497.png

In Settings, uncheck the First row only (9) box, then click Connection → More (10).

suparnababu8_4-1749215783510.png

Select SQL Server database (11).

suparnababu8_5-1749215804090.png

You will see this screen. Fill in the details.

suparnababu8_6-1749215817315.png

You’ll get the server name and database from your on-prem SQL Server. Go to your on-prem SQL Server, right-click the server (12), select Properties, and from the Properties window copy (13) the server name. Use Fabric_db as the database name (14).

suparnababu8_7-1749215830026.png

Step 3: Install and Configure the On-Premises Data Gateway

Since we’re accessing on-prem data, an on-premises data gateway is required to act as a secure bridge between your SQL Server and Fabric. To install the gateway, click here. Make sure the gateway is online and properly configured before proceeding.

suparnababu8_8-1749215864163.png

Now fill in all the details and click Connect (14).

suparnababu8_9-1749215879345.png

The pipeline connects to the on-prem database (15).

suparnababu8_10-1749215890497.png

Step 4: Configure the Lookup Query

Once the test connection succeeds (16), choose Query (17) and write the query shown in the yellow box (18) to retrieve the schema and table names.

suparnababu8_11-1749215909339.png
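The query itself is only visible in the screenshot, but a typical metadata query for this step (a sketch, assuming you want every user table in Fabric_db) looks like this:

```sql
-- List the schema and name of every user table in the current database.
-- With "First row only" unchecked, the Lookup activity returns all rows
-- as an array that a ForEach activity can iterate over.
SELECT TABLE_SCHEMA,
       TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE';
```

Filtering on `TABLE_TYPE = 'BASE TABLE'` excludes views, so only real tables are migrated.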

Step 5: Add a ForEach Activity

Now click Activities (19) and select the ForEach (20) activity; you’ll see it added to the canvas (21).

suparnababu8_12-1749215921088.png

Link the Lookup activity to the ForEach activity (22). Click Settings (23), check the Sequential box (24), and enter the Items parameter (25) to capture the output of the Lookup activity. Then, inside the ForEach, click the + icon (26).

suparnababu8_13-1749215932192.png
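The value in the Items box (25) is a pipeline expression. Assuming the Lookup activity kept its default name Lookup1 (adjust to whatever yours is called), it would be:

```
@activity('Lookup1').output.value
```

Because First row only was unchecked, `output.value` is the full array of rows from the lookup query, and the ForEach runs its inner activities once per row.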

Step 6: Add a Copy Data Activity Inside ForEach

Select Copy Data(27)

suparnababu8_14-1749215944054.png

Click the Copy activity (28), go to the Source tab (29), and choose the SQL Server database as the source; test the connection as well. Then check the Enter manually box and enter the parameter values (30).

suparnababu8_15-1749215955287.png
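The manually entered values (30) are per-iteration expressions. Assuming the lookup query returned the INFORMATION_SCHEMA columns TABLE_SCHEMA and TABLE_NAME, the Source fields would be:

```
Schema name: @item().TABLE_SCHEMA
Table name:  @item().TABLE_NAME
```

Inside a ForEach, `@item()` refers to the current row of the lookup output, so each iteration copies a different source table.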

Now click the Destination tab (32) and click More (33) to add the final destination.

suparnababu8_16-1749215967994.png

From this window, select Migration_LH (34).

suparnababu8_17-1749215980188.png

Step 7: Save and Run the Pipeline

The Lakehouse (35) is now added as the destination; provide the parameter in the Table section (36).

suparnababu8_18-1749215994297.png
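The Table section (36) also takes a per-iteration expression. Assuming the lookup query returned a TABLE_NAME column, the destination table name would be:

```
@item().TABLE_NAME
```

This creates (or loads) a Lakehouse table with the same name as the source table, so the 11 source tables land as 11 identically named Lakehouse tables.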

 

Now click on Run (37)

suparnababu8_19-1749216005780.png

Now click on Save and Run (38)

suparnababu8_20-1749216017784.png

Monitor the pipeline status; it should show as In Progress (39)

suparnababu8_21-1749216033737.png

The first copy activity has completed (40). Let’s check the Lakehouse.

suparnababu8_0-1749216134825.png

The Bike_data table has been copied into the Lakehouse.

suparnababu8_1-1749216148524.png

The pipeline run succeeded. In this example, it took approximately 8 minutes to migrate all 11 tables from SQL Server to the Fabric Lakehouse.

suparnababu8_2-1749216161296.png

Results & Verification

After a successful run, I was able to view all 11 tables—including the Bike_data table—inside the Fabric Lakehouse. Each table’s data was intact and ready for downstream analytics or visualization.

suparnababu8_3-1749216172360.png
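Beyond browsing the Lakehouse explorer, you can verify the migration with a quick query against the Lakehouse’s SQL analytics endpoint (a sketch; the table name Bike_data is from this example):

```sql
-- Count the migrated tables; in this example the result should be 11.
SELECT COUNT(*) AS migrated_tables
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE';

-- Spot-check the row count of one migrated table.
SELECT COUNT(*) AS row_count
FROM Bike_data;
```

Comparing these row counts against the source database is a simple sanity check that the data arrived intact.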

Conclusion:

This end-to-end migration approach demonstrates the power and simplicity of Microsoft Fabric in modernizing on-premises workloads.

 

🔄 If you're using Azure Data Factory (ADF), the process remains largely the same—except that Self-hosted Integration Runtime replaces the Data Gateway.

 

Whether you're moving legacy systems or building a repeatable ETL pattern, this solution offers performance, flexibility, and scalability.

 

If you found this helpful, please like, comment, or share—and let’s continue growing together in the Fabric community! 

 

Thank you all🙂

 

Happy Migrating!

 

— Inturi Suparna Babu

Comments

This is really helpful, thank you 

Hi,

 

Thanks for Sharing the steps to follow for Data Migration from SQL Server to Fabric Lakehouse.

Great work 🥳 @suparnababu8  Thanks for sharing content