Watch the video below to see a walkthrough of the Direct Lake migration process.
Check here to see the latest version.
%pip install semantic-link-labs
import sempy_labs as labs
from sempy_labs import migration, directlake
import sempy_labs.report as rep

dataset_name = ''  # Enter the import/DQ semantic model name
workspace_name = None  # Enter the workspace of the import/DQ semantic model. If set to None it will use the current workspace.
new_dataset_name = ''  # Enter the new Direct Lake semantic model name
new_dataset_workspace_name = None  # Enter the workspace where the Direct Lake model will be created. If set to None it will use the current workspace.
lakehouse_name = None  # Enter the lakehouse to be used for the Direct Lake model. If set to None it will use the lakehouse attached to the notebook.
lakehouse_workspace_name = None  # Enter the lakehouse workspace. If set to None it will use the new_dataset_workspace_name.
This encapsulates all of the semantic model's Power Query logic into a single file.
migration.create_pqt_file(dataset = dataset_name, workspace = workspace_name)
Open the OneLake file explorer and sync your files (right click -> Sync from OneLake).
Navigate to your lakehouse. From this window, create a new Dataflow Gen2, import the Power Query Template file from OneLake (OneLake -> Workspace -> Lakehouse -> Files...), and publish the Dataflow Gen2.
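Optionally, once the Dataflow Gen2 has been published and refreshed, you can confirm that the expected delta tables landed in the lakehouse before building the new model. A minimal sketch, assuming your version of semantic-link-labs exposes sempy_labs.lakehouse.get_lakehouse_tables:

from sempy_labs import lakehouse as lh

# Sketch: list the delta tables currently in the target lakehouse so you can
# confirm the Dataflow Gen2 output matches what the semantic model expects.
# (Assumes get_lakehouse_tables is available in your semantic-link-labs version.)
df_lake_tables = lh.get_lakehouse_tables(lakehouse=lakehouse_name, workspace=lakehouse_workspace_name)
display(df_lake_tables)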
Calculated columns are not migrated to the Direct Lake model as they are not supported in Direct Lake mode.
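If you want to see up front which objects will be left behind, you can run the same unsupported-objects check that appears later in this post against the source model first. A minimal sketch (the _pre variable names are just for illustration):

# Sketch: preview the unsupported objects in the import/DQ model before migrating.
# This is the same show_unsupported_direct_lake_objects call used later in this post.
dfT_pre, dfC_pre, dfR_pre = directlake.show_unsupported_direct_lake_objects(dataset=dataset_name, workspace=workspace_name)
display(dfC_pre)  # calculated columns (and binary columns) that will not be migrated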
import time
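# Create a blank semantic model that will become the new Direct Lake model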
labs.create_blank_semantic_model(dataset = new_dataset_name, workspace = new_dataset_workspace_name, overwrite=False)
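# Create delta tables in the lakehouse based on the calculated tables in the import/DQ model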
migration.migrate_calc_tables_to_lakehouse(
dataset=dataset_name,
new_dataset=new_dataset_name,
workspace=workspace_name,
new_dataset_workspace=new_dataset_workspace_name,
lakehouse=lakehouse_name,
lakehouse_workspace=lakehouse_workspace_name
)
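# Add the tables and columns from the import/DQ model to the new Direct Lake model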
migration.migrate_tables_columns_to_semantic_model(
dataset=dataset_name,
new_dataset=new_dataset_name,
workspace=workspace_name,
new_dataset_workspace=new_dataset_workspace_name,
lakehouse=lakehouse_name,
lakehouse_workspace=lakehouse_workspace_name
)
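# Add the migrated calculated tables (now delta tables in the lakehouse) to the new Direct Lake model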
migration.migrate_calc_tables_to_semantic_model(
dataset=dataset_name,
new_dataset=new_dataset_name,
workspace=workspace_name,
new_dataset_workspace=new_dataset_workspace_name,
lakehouse=lakehouse_name,
lakehouse_workspace=lakehouse_workspace_name
)
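# Migrate the remaining model objects (e.g. measures, relationships, hierarchies, roles)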
migration.migrate_model_objects_to_semantic_model(
dataset=dataset_name,
new_dataset=new_dataset_name,
workspace=workspace_name,
new_dataset_workspace=new_dataset_workspace_name
)
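# Recreate any field parameters in the new Direct Lake model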
migration.migrate_field_parameters(
dataset=dataset_name,
new_dataset=new_dataset_name,
workspace=workspace_name,
new_dataset_workspace=new_dataset_workspace_name
)
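# Pause briefly, then refresh the new Direct Lake model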
time.sleep(2)
labs.refresh_semantic_model(dataset=new_dataset_name, workspace=new_dataset_workspace_name)
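# Recreate the delta tables behind the migrated calculated tables, then refresh the model again so it picks them up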
migration.refresh_calc_tables(dataset=new_dataset_name, workspace=new_dataset_workspace_name)
labs.refresh_semantic_model(dataset=new_dataset_name, workspace=new_dataset_workspace_name)

Validate the migration and show which objects were (and were not) migrated to the new model:

migration.migration_validation(
dataset=dataset_name,
new_dataset=new_dataset_name,
workspace=workspace_name,
new_dataset_workspace=new_dataset_workspace_name
)

Rebind all reports that use the original semantic model to the new Direct Lake model:

rep.report_rebind_all(
dataset=dataset_name,
dataset_workspace=workspace_name,
new_dataset=new_dataset_name,
new_dataset_workpace=new_dataset_workspace_name,
report_workspace=None
)

Alternatively, rebind reports one at a time:

report_name = ''  # Enter the report name you want to rebind to the new Direct Lake model
rep.report_rebind(
report=report_name,
dataset=new_dataset_name,
report_workspace=workspace_name,
    dataset_workspace=new_dataset_workspace_name)

Show any objects in the original model that are not supported in Direct Lake mode:

dfT, dfC, dfR = directlake.show_unsupported_direct_lake_objects(dataset=dataset_name, workspace=workspace_name)
print('Calculated Tables are not supported...')
display(dfT)
print("Learn more about Direct Lake limitations here: https://learn.microsoft.com/power-bi/enterprise/directlake-overview#known-issues-and-limitations")
print('Calculated columns are not supported. Columns of binary data type are not supported.')
display(dfC)
print('Columns used for relationship must be of the same data type.')
display(dfR)

This will list any tables/columns which are in the new semantic model but do not exist in the lakehouse.
directlake.direct_lake_schema_compare(dataset=new_dataset_name, workspace=new_dataset_workspace_name)
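# Show the calculated tables (and their DAX expressions) that were migrated to the Direct Lake model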
directlake.list_direct_lake_model_calc_tables(dataset=new_dataset_name, workspace=new_dataset_workspace_name)
https://github.com/microsoft/semantic-link-labs/blob/main/notebooks/Migration%20to%20Direct%20Lake.ipynb
Thanks for sharing @mikova!