Migration to Direct Lake
Download this notebook from the microsoft/semantic-link-labs repository on GitHub: notebooks/Migration to Direct Lake.ipynb.
Watch the video below to see a walkthrough of the Direct Lake migration process.
Check here to see the latest version.
%pip install semantic-link-labs
Import the library and set initial parameters
import sempy_labs as labs
from sempy_labs import migration, directlake
import sempy_labs.report as rep

dataset_name = ''  # Enter the import/DQ semantic model name
workspace_name = None  # Enter the workspace of the import/DQ semantic model. If set to None, it will use the current workspace.
new_dataset_name = ''  # Enter the new Direct Lake semantic model name
new_dataset_workspace_name = None  # Enter the workspace where the Direct Lake model will be created. If set to None, it will use the current workspace.
lakehouse_name = None  # Enter the lakehouse to be used for the Direct Lake model. If set to None, it will use the lakehouse attached to the notebook.
lakehouse_workspace_name = None  # Enter the lakehouse workspace. If set to None, it will use the new_dataset_workspace_name.
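Before running the migration cells, it can help to fail fast if the required names were left blank. A minimal sketch of such a check (the helper below is hypothetical and not part of semantic-link-labs):

```python
def check_migration_params(dataset_name: str, new_dataset_name: str) -> None:
    """Raise early if the required semantic model names were left blank."""
    if not dataset_name:
        raise ValueError("dataset_name must be set to the import/DQ semantic model name")
    if not new_dataset_name:
        raise ValueError("new_dataset_name must be set to the new Direct Lake model name")
    if dataset_name == new_dataset_name:
        raise ValueError("new_dataset_name must differ from dataset_name")

check_migration_params('Sales Import', 'Sales Direct Lake')  # passes silently
```

Running this once at the top of the notebook avoids partially completed migrations caused by an empty parameter.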
Create the Power Query Template file
This encapsulates all of the semantic model's Power Query logic into a single file.
migration.create_pqt_file(dataset = dataset_name, workspace = workspace_name)
Import the Power Query Template to Dataflows Gen2
Open the OneLake file explorer and sync your files (right click -> Sync from OneLake)
Navigate to your lakehouse. From this window, create a new Dataflows Gen2 and import the Power Query Template file from OneLake (OneLake -> Workspace -> Lakehouse -> Files...), and publish the Dataflows Gen2.
Create the Direct Lake model based on the import/DQ semantic model
Calculated columns are not migrated to the Direct Lake model as they are not supported in Direct Lake mode.
import time

labs.create_blank_semantic_model(dataset=new_dataset_name, workspace=new_dataset_workspace_name, overwrite=False)

migration.migrate_calc_tables_to_lakehouse(
    dataset=dataset_name,
    new_dataset=new_dataset_name,
    workspace=workspace_name,
    new_dataset_workspace=new_dataset_workspace_name,
    lakehouse=lakehouse_name,
    lakehouse_workspace=lakehouse_workspace_name
)
migration.migrate_tables_columns_to_semantic_model(
    dataset=dataset_name,
    new_dataset=new_dataset_name,
    workspace=workspace_name,
    new_dataset_workspace=new_dataset_workspace_name,
    lakehouse=lakehouse_name,
    lakehouse_workspace=lakehouse_workspace_name
)
migration.migrate_calc_tables_to_semantic_model(
    dataset=dataset_name,
    new_dataset=new_dataset_name,
    workspace=workspace_name,
    new_dataset_workspace=new_dataset_workspace_name,
    lakehouse=lakehouse_name,
    lakehouse_workspace=lakehouse_workspace_name
)
migration.migrate_model_objects_to_semantic_model(
    dataset=dataset_name,
    new_dataset=new_dataset_name,
    workspace=workspace_name,
    new_dataset_workspace=new_dataset_workspace_name
)
migration.migrate_field_parameters(
    dataset=dataset_name,
    new_dataset=new_dataset_name,
    workspace=workspace_name,
    new_dataset_workspace=new_dataset_workspace_name
)

time.sleep(2)
labs.refresh_semantic_model(dataset=new_dataset_name, workspace=new_dataset_workspace_name)
migration.refresh_calc_tables(dataset=new_dataset_name, workspace=new_dataset_workspace_name)
labs.refresh_semantic_model(dataset=new_dataset_name, workspace=new_dataset_workspace_name)
Show migrated/unmigrated objects
migration.migration_validation(
    dataset=dataset_name,
    new_dataset=new_dataset_name,
    workspace=workspace_name,
    new_dataset_workspace=new_dataset_workspace_name
)
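The validation call returns a dataframe summarizing which objects were migrated. Assuming it contains one row per object with a migration-status column (the column names below are illustrative only; check the actual output), you can filter down to the unmigrated objects with ordinary pandas:

```python
import pandas as pd

# Illustrative stand-in for the dataframe returned by migration_validation;
# the real column names may differ.
dfV = pd.DataFrame({
    'Object Type': ['Table', 'Column', 'Measure'],
    'Object Name': ['Sales', 'Sales[Profit]', 'Total Sales'],
    'Migrated': [True, False, True],
})

# Keep only the rows that did not migrate
unmigrated = dfV[~dfV['Migrated']]
print(unmigrated['Object Name'].tolist())  # ['Sales[Profit]']
```

Reviewing just the unmigrated rows makes it easier to decide what needs a manual fix (for example, recreating a calculated column in the lakehouse).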
Rebind all reports using the old semantic model to the new Direct Lake semantic model
rep.report_rebind_all(
    dataset=dataset_name,
    dataset_workspace=workspace_name,
    new_dataset=new_dataset_name,
    new_dataset_workspace=new_dataset_workspace_name,
    report_workspace=None
)
Rebind reports one-by-one (optional)
report_name = ''  # Enter the name of the report you want to rebind to the new Direct Lake model

rep.report_rebind(
    report=report_name,
    dataset=new_dataset_name,
    report_workspace=workspace_name,
    dataset_workspace=new_dataset_workspace_name
)
Show unsupported objects
dfT, dfC, dfR = directlake.show_unsupported_direct_lake_objects(dataset=dataset_name, workspace=workspace_name)

print('Calculated tables are not supported.')
display(dfT)
print("Learn more about Direct Lake limitations here: https://learn.microsoft.com/power-bi/enterprise/directlake-overview#known-issues-and-limitations")
print('Calculated columns are not supported. Columns of binary data type are not supported.')
display(dfC)
print('Columns used for relationships must be of the same data type.')
display(dfR)
Schema check between semantic model tables/columns and lakehouse tables/columns
This will list any tables/columns which are in the new semantic model but do not exist in the lakehouse.
directlake.direct_lake_schema_compare(dataset=new_dataset_name, workspace=new_dataset_workspace_name)
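Conceptually, the schema compare boils down to a set difference between the (table, column) pairs defined in the semantic model and those present in the lakehouse. A simplified sketch of that idea (not the library's actual implementation; the table and column names are made up):

```python
# (table, column) pairs defined in the Direct Lake semantic model
model_schema = {('Sales', 'Amount'), ('Sales', 'OrderDate'), ('Product', 'Color')}

# (table, column) pairs that actually exist in the lakehouse
lakehouse_schema = {('Sales', 'Amount'), ('Sales', 'OrderDate')}

# Anything referenced by the model but missing from the lakehouse
# would fail when the Direct Lake model is refreshed
missing = sorted(model_schema - lakehouse_schema)
print(missing)  # [('Product', 'Color')]
```

Any pair reported as missing should be added to the lakehouse (or removed from the model) before the Direct Lake model is refreshed.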
Show calculated tables which have been migrated to the Direct Lake semantic model as regular tables
directlake.list_direct_lake_model_calc_tables(dataset=new_dataset_name, workspace=new_dataset_workspace_name)
Thanks for sharing @mikova!
