In this blog, I’ll walk you through how to migrate multiple SQL Server tables into a Fabric Lakehouse using Data Pipelines. This approach is particularly effective for legacy modernization or for building scalable ETL frameworks in Microsoft Fabric.
Scenario Setup
In this example, I’m migrating 11 tables from my local SQL Server database, Fabric_db, to a Lakehouse in Fabric called Migration_LH.
Step 1: Create a New Data Pipeline
Navigate to your Fabric workspace, open the Migration_LH Lakehouse, and click New data pipeline.
In the window that opens, give the pipeline a descriptive name, such as migration_pipeline.
Step 2: Use a Lookup Activity to Fetch Table Metadata
Add a Lookup activity from the pipeline Activities pane.
Open the Lookup activity’s Settings tab.
In Settings, uncheck the First row only box, then click Connection --> More.
Select SQL Server database.
On the connection screen, fill in the details.
You can find the server and database names on your on-prem SQL Server: in SQL Server Management Studio, right-click the server, select Properties, and copy the server name from the properties window. Use Fabric_db as the database name.
Step 3: Install and Configure the On-Premises Data Gateway
Since we’re accessing on-prem data, an on-premises data gateway is required to act as a secure bridge between your SQL Server and Fabric. Refer to Microsoft’s on-premises data gateway installation documentation, and make sure the gateway is online and properly configured before proceeding.
Fill in all the connection details and click Connect to connect to the on-prem database.
Step 4: Configure the Lookup Query
Once the test connection succeeds, choose Query and write a query that returns the schema name and table name for each table you want to migrate.
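The query itself appeared only in a screenshot, but a typical metadata query against SQL Server’s system catalog views looks like the sketch below; the column aliases SchemaName and TableName are my own choice and must match whatever you reference later in the pipeline:

```sql
-- List every user table with its schema, one row per table.
-- Add a WHERE clause if you only want to migrate a subset.
SELECT
    s.name AS SchemaName,
    t.name AS TableName
FROM sys.tables AS t
JOIN sys.schemas AS s
    ON t.schema_id = s.schema_id;
```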
Step 5: Add a ForEach Activity
Now open the Activities pane and select the ForEach activity; you’ll see it added to the canvas.
Link the Lookup activity to the ForEach activity, click Settings, check the Sequential box, and in the Items field enter an expression that captures the output of the Lookup activity. Then, inside the ForEach, click the + icon.
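As a sketch, if the Lookup activity kept its default name Lookup1 (adjust this to your actual activity name), the Items expression would be:

```
@activity('Lookup1').output.value
```

Because First row only is unchecked, output.value is the array of rows returned by the lookup query, so the ForEach iterates once per table.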
Step 6: Add a Copy Data Activity Inside ForEach
Select Copy data.
Click the Copy activity and open the Source tab. Choose the SQL Server database connection, test the connection, check the Enter manually box, and supply the schema and table values as parameters from the current ForEach item.
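Assuming the lookup query aliases its columns as SchemaName and TableName (adjust to match your own aliases), the dynamic content for the manually entered source fields would be:

```
Schema name: @item().SchemaName
Table name:  @item().TableName
```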
Now open the Destination tab and click More to add the final destination.
From this window, select Migration_LH.
Step 7: Save and Run the Pipeline
The Lakehouse is now set as the destination; in the Table section, supply the table name as a parameter (for example, @item().TableName, matching the column alias from your lookup query).
Now click Run, then Save and run when prompted.
Monitor the pipeline status; it should show as In Progress (39)
Once the first Copy activity completes, let’s check the Lakehouse: the Bike_data table has been copied.
The pipeline run then succeeded. In this example, it took approximately 8 minutes to migrate all 11 tables from SQL Server to the Fabric Lakehouse.
Results & Verification
After a successful run, I was able to view all 11 tables—including the Bike_data table—inside the Fabric Lakehouse. Each table’s data was intact and ready for downstream analytics or visualization.
Conclusion
This end-to-end migration approach demonstrates the power and simplicity of Microsoft Fabric in modernizing on-premises workloads.
🔄 If you're using Azure Data Factory (ADF), the process remains largely the same—except that Self-hosted Integration Runtime replaces the Data Gateway.
Whether you're moving legacy systems or building a repeatable ETL pattern, this solution offers performance, flexibility, and scalability.
If you found this helpful, please like, comment, or share—and let’s continue growing together in the Fabric community!
Thank you all🙂
✨ Happy Migrating!
— Inturi Suparna Babu