08-22-2025 10:53 AM - last edited 08-22-2025 11:45 AM
This notebook deploys a Composite Semantic Model that combines Direct Lake and Import storage modes. Import tables can come from any supported data source, and relationships between Direct Lake on OneLake and Import tables are regular relationships. Small dimension or lookup tables already in Direct Lake storage mode can instead use Import storage mode, giving you the option to extend the table with calculated columns and to structure it with hierarchies for use in Power BI reports and Excel pivot tables. The feature was announced in the Power BI May 2025 release.
Requirements:
The only mandatory library is Semantic Link Labs, a Python library designed for use in Microsoft Fabric notebooks. It extends the capabilities of Semantic Link, offering additional functionality that integrates seamlessly with it. In this notebook, Semantic Link Labs is used to capture the reference models' definition files and to create or update the Composite Model.
The remaining libraries are optional and are used to export the Composite Model definition file to a Lakehouse.
# Install Semantic Link Labs
%pip install semantic-link-labs
# Required Library
import sempy_labs as labs
# Optional Libraries
import json
from notebookutils import fs
All parameters expect the name or the ID of the object, except the import_tables parameter, which requires a list of the table names that will be switched to Import mode.
# Required parameters
workspace = 'ws_demo' # Reference workspace
dataset_import = 'sm_import' # Source: Import model
dataset_directlake ='sm_directlake' # Source: Direct Lake model
dataset_composite = 'sm_composite' # Sink: Composite model
import_tables = ['table1', 'table2', 'table3'] # List of tables that will be set to Import mode
# Optional parameters
storage = 'lh_demo' # Lakehouse to output the Composite Model definition file
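If you are unsure of the exact names or IDs to use for these parameters, Semantic Link's fabric module (preinstalled in the Fabric notebook runtime) can list them. This is an optional sketch, reusing the workspace variable defined above:
# Optional: look up workspace and semantic model names/IDs
import sempy.fabric as fabric
display( fabric.list_workspaces() )
display( fabric.list_datasets( workspace = workspace ) )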
Process workflow:
Existing relationships in the Direct Lake model are preserved. The deployment will fail if the column names differ between the Direct Lake and Import models.
# Get Model Definition file (BIM) from source models
bim_import = labs.get_semantic_model_bim( dataset = dataset_import, workspace = workspace )
bim_directlake = labs.get_semantic_model_bim( dataset = dataset_directlake, workspace = workspace )
# Delete the tables that will change storage mode from the Direct Lake file
bim_directlake['model']['tables'] = [
    table for table in bim_directlake['model']['tables']
    if table['name'] not in import_tables
]
# Include the tables with changed storage mode to Import in the Direct Lake file
for table in bim_import['model']['tables']:
    if table['name'] in import_tables:
        bim_directlake['model']['tables'].append(table)
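Before deploying, you can optionally verify that the columns used by the existing relationships are present in the swapped Import tables, since that is where mismatched column names would surface. This is a minimal sketch against the standard BIM/TMSL structure (relationships expose fromTable/fromColumn and toTable/toColumn); verify the keys against your own definition file:
# Optional: check that relationship columns exist in the Import tables
table_columns = { t['name']: { c['name'] for c in t.get('columns', []) } for t in bim_directlake['model']['tables'] }
for rel in bim_directlake['model'].get('relationships', []):
    for tbl, col in [ (rel.get('fromTable'), rel.get('fromColumn')), (rel.get('toTable'), rel.get('toColumn')) ]:
        if tbl in import_tables and col not in table_columns.get(tbl, set()):
            print(f'Column {col} not found in Import table {tbl}. Fix the column name before deploying.')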
The deployment is done via the XMLA endpoint. It creates the model, or updates it if it already exists. After the deployment, a new connection will be displayed in the Gateway and Cloud Connections section of the Semantic Model properties. A rebind of the Import tables connection is required, since it cannot rely on single sign-on the way Direct Lake does (this is only needed after the first deployment).
After the rebind, the semantic model can be refreshed.
# Create/Update the Composite Model
try:
    labs.create_semantic_model_from_bim( dataset = dataset_composite, bim_file = bim_directlake, workspace = workspace )
except Exception:
    # If the model already exists, update it instead
    labs.update_semantic_model_from_bim( dataset = dataset_composite, bim_file = bim_directlake, workspace = workspace )
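Once the Import connection has been rebound (first deployment only), the refresh can also be triggered from the notebook. A minimal sketch using Semantic Link Labs' refresh_semantic_model, assuming the default full refresh fits your scenario:
# Optional: refresh the Composite Model after the connection rebind
labs.refresh_semantic_model( dataset = dataset_composite, workspace = workspace )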
The steps below are optional.
# Retrieves the list of connections visible in the Fabric environment
connections = labs.list_connections()
# Retrieves the list of connection dependencies in the Semantic Model
model_connections = labs.list_item_connections( item_name = dataset_composite, item_type = 'SemanticModel', workspace = workspace )
# Connections that do not rely on AzureDataLakeStorage (Direct Lake on OneLake)
connection_ids = model_connections[ model_connections['Connection Type'] != 'AzureDataLakeStorage' ]['Connection Id']
for con in connection_ids:
    connection_path = model_connections[ model_connections['Connection Id'] == con ]['Connection Path'].iloc[0]
    if con:
        connection_name = connections[ connections['Connection Id'] == con ]['Connection Name'].iloc[0]
        print(f'The path {connection_path} has been mapped to the {connection_name} connection.')
    else:
        print(f'No connection found for path {connection_path}. Bind the connection before refreshing the semantic model.')
# Save model definition to Lakehouse
path = f'abfss://{workspace}@onelake.dfs.fabric.microsoft.com/{storage}.Lakehouse/Files/{dataset_composite}.json'
file = json.dumps( bim_directlake, indent = 2 )
fs.put( path, file, overwrite = True )
https://github.com/diego-dsanalytics/fabric-notebooks/blob/main/Files/composite_model.ipynb
Outstanding work!