Rafaela07
Frequent Visitor

Switch from Direct Lake to Import Mode

Hi everyone,

I'm facing the following challenge while trying to switch from Direct Lake mode to Import mode.
Let me give some context.
I'm using Fabric to implement a medallion architecture (bronze & silver: Lakehouse, gold: Warehouse) for my data transformation, followed by a semantic model and reports that use this semantic model as their data source.
The reports are created in the service and the default mode is Direct Lake; however, I want to use Import.
In the tutorials I've watched so far, the recommendation is to connect from Desktop and, instead of choosing a live connection, connect to the desired lakehouse's or warehouse's SQL endpoint, like the screenshot shown below (taken from a Udemy course).

Rafaela07_0-1765787962573.png

I tried to proceed with this methodology, and it does indeed work if you want to build your report from scratch (which, to be honest, I don't want to do). But when I tried to do the same for a report already built with Direct Lake, it wasn't successful.
The steps I followed:
1. Download the report from Fabric (the only option available was "A copy of your report with a live connection to data online (.pbix)")
2. Open it in Desktop by importing it (with a live connection)
3. Tried to follow the same process as before and failed.
When I search in the OneLake catalog, the view I get is the following.
I first get the message below, where I proceed with the "Add a local model" selection.

Rafaela07_1-1765788777482.png
Then I'm taken directly to the semantic model already used in the report and I can't change anything: neither the source (e.g. switching to the SQL endpoint) nor the mode. The only options are to select/deselect tables that exist in the semantic model and submit the changes.

Rafaela07_2-1765789354553.png

Is this doable? Does anyone have a recommendation for this that doesn't involve recreating the reports, but just switching the storage mode?

Thank you in advance,

 

 


9 REPLIES

cengizhanarslan
Solution Sage

Hi,

The fastest and cleanest way to switch an existing Direct Lake semantic model to Import mode—without rebuilding reports—is by using a notebook-based approach in Microsoft Fabric, leveraging semantic-link-labs.

Below is a step-by-step solution.

 

Step 1: Install semantic-link-labs

Run the following in a Fabric notebook:

 

%pip install semantic-link-labs

Step 2: Duplicate the semantic model and switch it to Import mode

This step extracts the existing semantic model definition and recreates it as a new Import-mode model in the target workspace.

 

import sempy_labs as sl

# Source configuration
dataset = 'Model'
source_workspace = 'WS'

# Target configuration
new_dataset = 'Model_Import'
target_workspace = 'WS'

# Get the semantic model definition from the source workspace
model_bim = sl.get_semantic_model_definition(
    dataset=dataset,
    format='TMSL',
    workspace=source_workspace,
    return_dataframe=False
)

# Create the semantic model in the target workspace (Import mode)
sl.create_semantic_model_from_bim(
    dataset=new_dataset,
    bim_file=model_bim,
    workspace=target_workspace
)

 

This effectively duplicates the Direct Lake semantic model as an Import model, without needing Power BI Desktop.


Step 3: Rebind existing reports to the new Import semantic model

Once the new model exists, you can rebind existing reports so they point to the new Import-mode dataset.

 
import sempy_labs.report as rep

# List of reports to rebind
reports = ['report1', 'report2']

# Rebind each report to the new semantic model
for report in reports:
    rep.report_rebind(
        report=report,
        dataset='Model_Import',
        report_workspace='WS',
        dataset_workspace='WS'
    )
    print(f"✓ {report} successfully rebound")
_________________________________________________________
If this helped, ✓ Mark as Solution | Kudos appreciated
Connect on LinkedIn

Hi @cengizhanarslan 
This indeed duplicates the semantic model; however, the tables inside still appear in Direct Lake mode and there is also a warning sign. So I don't think this approach works.

Rafaela07_0-1765793714428.png

 

ACCEPTED SOLUTION

The warning happens because the model still requires a refresh operation, since it is still Direct Lake. Sorry for the misunderstanding here. To convert your model to Import mode, you could use the following C# script in Tabular Editor after connecting to your model. After running it, use the Model --> Deploy option to deploy the newly converted model as a new model in a workspace.

 

using System.Reflection;

// M query template for the new Import partitions; it relies on the model's
// existing DatabaseQuery expression (the SQL endpoint connection).
const string mImportTemplate =
    @"let
    Source = DatabaseQuery,
    Data = Source{{[Schema=""{0}"",Item=""{1}""]}}[Data]
in
    Data";

foreach (var table in Model.Tables)
{
    if (table.Partitions.Count != 1) continue;

    var partition = table.Partitions[0];
    if (partition.Mode != ModeType.DirectLake) continue;

    // Reach into the underlying TOM partition to read the Direct Lake source.
    var pMetadataObject =
        typeof(Partition).GetProperty(
            "MetadataObject",
            BindingFlags.Instance | BindingFlags.NonPublic | BindingFlags.DeclaredOnly);

    var tomPartition =
        pMetadataObject.GetValue(partition)
        as Microsoft.AnalysisServices.Tabular.Partition;

    var tomPartitionSource =
        tomPartition.Source
        as Microsoft.AnalysisServices.Tabular.EntityPartitionSource;

    if (tomPartitionSource == null) continue;

    var schemaName = tomPartitionSource.SchemaName;
    var tableName = tomPartitionSource.EntityName;

    // Replace the Direct Lake partition with an M (Import) partition
    // pointing at the same schema and table.
    var partitionName = partition.Name;
    partition.Name += "_old";

    table.AddMPartition(
        partitionName,
        string.Format(mImportTemplate, schemaName, tableName));

    partition.Delete();
}

Model.Collation = null;
Model.DefaultMode = ModeType.Import;
Model.RemoveAnnotation("TabularEditor_DirectLake");
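
As an optional check (my addition, not part of the script above): after deploying and refreshing the converted model, you can confirm from a Fabric notebook that its partitions are now Import. A minimal sketch, assuming the deployed model is named 'Model_Import' in workspace 'WS' and that your semantic-link (sempy) version exposes list_partitions:

import sempy.fabric as fabric

# List the partitions of the converted model; the mode column should now
# show Import rather than DirectLake (column names can vary by version).
partitions = fabric.list_partitions(dataset='Model_Import', workspace='WS')
print(partitions)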

_________________________________________________________
If this helped, ✓ Mark as Solution | Kudos appreciated
Connect on LinkedIn

Hi @Rafaela07 ,

Thanks for reaching out to the Microsoft Fabric community forum.

I would also like to take a moment to thank @cengizhanarslan for actively participating and for the solutions you've been sharing in the community forum. Your contributions make a real difference.

I hope the above details help you fix the issue. If you still have any questions or need more help, feel free to reach out. We're always here to support you.

 

 

Best Regards, 
Community Support Team

Hi @Rafaela07 ,

I hope the above details help you fix the issue. If you still have any questions or need more help, feel free to reach out. We're always here to support you.

 

 

Best Regards, 
Community Support Team

amitchandak
Super User

@Rafaela07 , For Import, connect using the SQL Server or Azure SQL connector with the SSO/Entra login option. You can get the URL from the settings of the SQL Analytics endpoint.

 

Workspace -> SQL analytics endpoint -> Ellipsis icon -> SSMS

amitchandak_0-1765791654591.png

 

Share with Power BI Enthusiasts: Full Power BI Video (20 Hours) YouTube
Microsoft Fabric Series 60+ Videos YouTube
Microsoft Fabric Hindi End to End YouTube

Hi @amitchandak 
I tried that too, but the issue persists. It doesn't even prompt me with a pop-up to enter the endpoint; I only get the screens I've already shown above.

@Rafaela07 , New Blank Report -> SQL Server connection in Power BI

amitchandak_0-1765792079117.png

Import is available on the first screen itself.

Microsoft account 

amitchandak_1-1765792096875.png

You will see the tables, and the connection mode is Import from the start.

amitchandak_2-1765792162181.png

 

Share with Power BI Enthusiasts: Full Power BI Video (20 Hours) YouTube
Microsoft Fabric Series 60+ Videos YouTube
Microsoft Fabric Hindi End to End YouTube

Hi @amitchandak 
But in this situation I'll have to recreate the report, correct? There is no way to "import" the existing one, e.g. as a template.
