
daniele_tiles
Helper III

How to migrate an existing semantic model to lakehouse

Hi all,

we're experimenting with Microsoft Fabric and we have some questions:

  1. We have a report built with Power BI Desktop, with some M query transformations inside. We'd like to create the same semantic model on a Fabric lakehouse. The first question is: is it possible to reuse the original M query code for the final tables? At the moment we have just copied the tables to the lakehouse, but then we cannot use the M query editor in the Service to add additional steps, am I right? Is the only way to do the same transformations via a notebook on the final tables?
  2. How do we switch the report from a model built with Power BI Desktop to a semantic model on a lakehouse?

Best regards

 

Daniele 

1 ACCEPTED SOLUTION

Hi @daniele_tiles ,

 

Of course you can switch your existing reports to your new Fabric-powered semantic models. Nobody wants you to delete the old reports and create new, live-connected reports on the lakehouse semantic models. All your users' bookmarks to the old reports would break, and what an outcry that would be from your users!

 

The process you are describing is called "rebind".

 

First, create your lakehouse-powered semantic models using the technology of your choice (Direct Lake, Import, DirectQuery, dual storage mode, composite model); I'd recommend Direct Lake on a lakehouse. The new semantic models must be compatible with the old ones, i.e., they must contain the same table names, the same column names, and the same measure names (they may also introduce additional tables, columns, and measures).
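This compatibility requirement can be checked mechanically: compare the old and new models' table, column, and measure names and flag anything the new model is missing. A minimal sketch of that check (the name lists are placeholders; in practice you'd pull them from the two models with a tool of your choice):

```python
def missing_names(old_names, new_names):
    """Names the old model has but the new one lacks -- these would break reports."""
    return sorted(set(old_names) - set(new_names))

# Placeholder name lists -- substitute the real names from your two models.
old_measures = ["Total Sales", "Margin %"]
new_measures = ["Total Sales", "Margin %", "YoY Growth"]

# Extra names in the new model are fine; only missing ones are a problem.
assert missing_names(old_measures, new_measures) == []
```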

 

For rebinding, there is a REST API:

Reports - Rebind Report - REST API (Power BI Power BI REST APIs) | Microsoft Learn

If you want to use PowerShell, you can use the Invoke-PowerBIRestMethod cmdlet to call the REST API:
Invoke-PowerBIRestMethod (MicrosoftPowerBIMgmt.Profile) | Microsoft Learn
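As a sketch of what the rebind call looks like: the documented endpoint is a POST to `/groups/{groupId}/reports/{reportId}/Rebind` with a JSON body naming the target dataset. A small Python helper that builds the request (the IDs are placeholders; actually sending it requires an Azure AD bearer token, omitted here):

```python
# Sketch: build the "Reports - Rebind Report" REST request.
# The group/report/dataset IDs below are placeholders -- use your tenant's.

def build_rebind_request(group_id: str, report_id: str, new_dataset_id: str):
    """Return (url, json_body) for the Rebind Report API call."""
    url = (
        "https://api.powerbi.com/v1.0/myorg"
        f"/groups/{group_id}/reports/{report_id}/Rebind"
    )
    body = {"datasetId": new_dataset_id}  # the new (lakehouse) semantic model
    return url, body

url, body = build_rebind_request("ws-123", "rep-456", "ds-789")
# POST `body` as JSON to `url` with an "Authorization: Bearer <token>" header,
# e.g. requests.post(url, json=body, headers=...).
```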

 

Kind regards,

Martin


v-yohua-msft
Community Support

Hi, @daniele_tiles 

Recreate the data model in the lakehouse by copying the tables over, and make sure all the necessary transformations are applied via Spark in a notebook or through the SQL analytics endpoint.
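For example, the kinds of steps typically done in the M query editor (filter rows, rename columns, change types) can be reproduced in a Fabric notebook. A minimal sketch using pandas (the table and column names are made up; in practice you would read from and write back to the lakehouse's Delta tables, e.g. via the Spark DataFrame API):

```python
import pandas as pd

# Hypothetical raw table -- in a Fabric notebook you would instead load a
# lakehouse table, e.g. spark.read.table("Sales").toPandas().
raw = pd.DataFrame(
    {"OrderID": [1, 2, 3], "amt": ["10.5", "20.0", ""], "Region": ["N", "S", "N"]}
)

# Mirror typical M steps: remove blank rows, rename a column, change a type.
sales = (
    raw[raw["amt"] != ""]                # like Table.SelectRows
    .rename(columns={"amt": "Amount"})   # like Table.RenameColumns
    .astype({"Amount": "float64"})       # like Table.TransformColumnTypes
)

# Write the result back as a lakehouse table (Fabric notebook pattern), e.g.:
# spark.createDataFrame(sales).write.mode("overwrite").saveAsTable("SalesClean")
```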

Create a semantic model on the lakehouse and take advantage of the lakehouse's capabilities to define it. This may involve setting up relationships and measures directly in the lakehouse environment. For detailed steps, see the documentation on creating a lakehouse for Direct Lake:

Learn how to create a Lakehouse for Direct Lake in Power BI and Microsoft Fabric - Power BI | Micros...

 

Once you've prepared your semantic model in a lakehouse, you can use DirectQuery mode to connect Power BI to this model. This approach allows you to leverage real-time data and logic defined in Lakehouse. For guidance on connecting Power BI to an external data source, you can check the following link:

Access shared semantic models in Power BI as a guest user from an external organization (preview) - ...

 


Best Regards

Yongkang Hua

If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.

Hi @v-yohua-msft ,

what about switching an existing report from Import/DirectQuery to a semantic model built on a lakehouse? Is it possible? How?

Thank you

 

Daniele


Thanks @Martin_D , I was worried that rebind wouldn't work, and I was hoping for something more "user-friendly". But it's a start for testing.

You can try it with a test report the first time, but once you have the script for your tenant, all you need to do is replace the report ID and dataset ID in the script and run it. That's done fast, which is what I'd call user-friendly; I can't imagine any faster way. Even if there were a workflow with a graphical UI, if clicking around took longer than running the script, I wouldn't call it more user-friendly. Rebind works like a breeze!

daniele_tiles
Helper III
Helper III

OK, maybe for the first point we can leverage dataflows, I get it.

 
