Hi,
I'd like to know if there are any tools to facilitate data migration to a Fabric SQL Database.
It seems that the Migration Assistant is specific to a data warehouse.
Thanks
Hi @AhmedMamdoh ,
Thank you for reaching out to the Microsoft Community Forum. And thank you, @neelball and @lbendlin , for your prompt responses.
The Migration Assistant is scoped to Fabric Data Warehouse, not the operational Fabric SQL Database.
As mentioned by @neelball , Microsoft is working on migration tooling for moving SQL databases into Fabric.
In the meantime, please try the following workarounds to migrate data into a Fabric SQL Database.
1. Copy activity in Fabric Data Factory: create a pipeline that copies data from SQL Server, Azure SQL Database or Oracle and loads it directly into Fabric SQL Database tables.
2. Fabric Dataflow Gen2: output the final tables directly into a Fabric SQL Database.
3. Notebooks (PySpark): read the source data and insert it into the SQL Database.
Please refer to the sample code below.
# read the source data (Parquet in OneLake or Azure Storage, as an example)
df = spark.read.parquet("abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<lakehouse>/Files/<path>")

# write into the Fabric SQL Database over JDBC (placeholder endpoint and credentials)
jdbc_url = "jdbc:sqlserver://<fabric-sql-endpoint>;databaseName=<database>"
df.write \
    .format("jdbc") \
    .option("url", jdbc_url) \
    .option("dbtable", "dbo.TargetTable") \
    .option("user", "<user>") \
    .option("password", "<password>") \
    .mode("append") \
    .save()
Note: The above code assumes your data is in Azure Storage or OneLake in CSV/Parquet/Delta format, and that the target endpoint is reachable via JDBC. Otherwise you will need to adjust the code.
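If the data isn't Spark-readable, a plain Python loader over ODBC is another option. A minimal sketch, assuming the pyodbc package is installed and you have a connection string for the Fabric SQL endpoint (table, column and credential names here are placeholders, not Fabric-specific APIs):

```python
def build_insert(table, columns):
    """Build a parameterized INSERT statement for use with executemany."""
    cols = ", ".join(columns)
    marks = ", ".join("?" for _ in columns)
    return f"INSERT INTO {table} ({cols}) VALUES ({marks})"

def load_rows(conn_str, table, columns, rows, batch_size=1000):
    """Bulk-insert rows (a list of tuples) into the target table in batches."""
    import pyodbc  # assumes the pyodbc package is available
    with pyodbc.connect(conn_str) as conn:
        cur = conn.cursor()
        cur.fast_executemany = True  # send each batch in fewer round trips
        sql = build_insert(table, columns)
        for i in range(0, len(rows), batch_size):
            cur.executemany(sql, rows[i:i + batch_size])
        conn.commit()
```

The connection string would point at your Fabric SQL endpoint, e.g. "Driver={ODBC Driver 18 for SQL Server};Server=<fabric-sql-endpoint>;Database=<database>;..." — the exact driver name and authentication settings depend on your environment.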
I hope this information helps. Please do let us know if you have any further queries.
Regards,
Dinesh
You can use the standard Import/Export Wizard that ships with SQL Server, or other SSIS packages.