Hi,
I am looking into improving our BI setup by moving from using individual PBIX files as data warehouses to a medallion setup resulting in a populated lakehouse instead. From this lakehouse, I will then build semantic models corresponding to each old PBIX file we had.
I have close to 10 different data sources, some tiny and some larger (one of which is a SQL table with nearly 40 columns and 40 million rows).
I've made a semantic model, connected a PBIX to the model (using Direct Lake) and published the report. The issues I am now facing are:
1. While I accept that direct lake is slower than import, some rather simple queries vary between 30 sec and 2m30s to complete. Taking more than 30 sec is unacceptable for our setup.
2. The pricing is not too transparent. I haven't gone live with our new setup yet, but I am afraid that having 40+ people sending direct lake queries will quickly consume all of our available capacity units (currently on F4 where 50% of the resources are taken by non-interactive requests such as data load, transformations, etc). Moving to F8 is ok but if even that is not enough it will become too expensive.
To circumvent the slow queries we experience using direct lake I was trying to switch the mode to import but I don't know how to do this. I like the fact that the semantic models are created and stored online in our fabric capacity instead of in a local PBIX file. I also tried using a warehouse instead of a lakehouse but was only able to connect to the default semantic model and not any additional ones I made.
I am unsure whether creating multiple warehouses will result in duplicate data, or if the underlying data is somehow not duplicated even though it is part of multiple different warehouses.
---
So I guess my question is: can I make different semantic models on top of the same data, but use Import instead of Direct Lake / DirectQuery?
Thanks in advance
------------------------------
Edit: Maybe I was not clear in my initial description of what I am looking to do.
Currently, we have several reports based on the same data models. We have split the data and model from the reporting, e.g.
Data model 1 has reports a, b, and c
Data model 2 has reports d and e.
This enables us to re-use the same data for several different reports, and means we only have to import the data once per semantic model.
However, as all of our data is already in our lakehouse, I want to know if I can make several different semantic models (online in our Fabric capacity, not in local PBIX files) that I can connect (via live connection) to several different reports, similar to our existing setup where several reports use the same data model with a live connection.
Solved!
You can create multiple Semantic Models connected to a Lakehouse.
You can also create multiple reports connected to each Semantic Model.
Here I have created multiple semantic models (some Direct Lake semantic models and some Import mode semantic models), all of them connected to a single Lakehouse.
The same can also be done for a Data Warehouse.
When you create Import Mode semantic models, you can take full advantage of Power Query to customize each semantic model.
Import Mode semantic models mean you will get a copy of the data, but it isn't necessarily a problem.
I believe Import Mode semantic models will continue to be the most popular mode, based on the current functionality of Import Mode vs. Direct Lake, because Import Mode semantic models are very customizable.
I think Import Mode semantic models need to be created in Power BI Desktop (or 3rd party tool like Tabular Editor). I don't think Import Mode semantic models can be created in web browser (Fabric) yet.
When you create Direct Lake semantic models, you have limited functionality to customize your semantic model.
On the plus side, Direct Lake semantic models do not create a copy of the data. Also, you don't need to refresh the semantic model, because it is directly connected to the Lakehouse tables.
Direct Lake semantic models can only be created in Fabric (web browser) at the moment, I think. (Maybe also in 3rd party tools like Tabular Editor? I haven't tested).
Perhaps it will be possible to create Direct Lake semantic models from Power BI Desktop in the future.
However, you can already create reports, which have Live Connection to a Direct Lake semantic model, in Power BI Desktop.
This article provides some further reading regarding Import Mode vs. Direct Lake: https://www.sqlbi.com/blog/marco/2024/04/06/direct-lake-vs-import-mode-in-power-bi/
Hope this helps to answer your question.
If not, please elaborate more on your requirements.
You can also create a composite semantic model which uses DirectQuery to connect to multiple other semantic models.
I think this composite semantic model needs to be created in Power BI Desktop (or 3rd party tool like Tabular Editor?).
Maybe the composite semantic model will be resource demanding, and not so good performance? I don't know, I have not tested it at scale.
Further reading:
https://learn.microsoft.com/en-us/power-bi/transform-model/desktop-composite-models#composite-models...
Hi frithjof_v,
Thanks again for your reply. I think you've answered my question now.
If I understand you correctly: to create a semantic model using Import mode based on data in a lakehouse, I have to connect to the lakehouse's SQL endpoint, load all the data into a local PBIX, and then publish it? As mentioned, I was hoping to avoid this by creating the semantic model online and simply connecting to it via a live connection, similar to how reports can connect to other published semantic models. But it seems this is not possible.
Yes, as far as I know, currently we cannot create an import mode semantic model in the user interface of the online service (Fabric, powerbi.com).
I think import mode semantic models have to be created in Power BI Desktop (or Tabular Editor; maybe some other tools are also possible) and then published to the online service.
After publishing the import mode semantic model to your Fabric workspace, you can create reports which are connected via live connection to your published import mode semantic model.
If you don't want to load the data into your pbix file locally, I think you can use parameters in your query and then change the parameter value online after publishing.
You could use the parameters to filter your data, or the parameter could point to a development lakehouse, which you then switch to point to the production lakehouse after publishing the semantic model online.
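As a minimal sketch of the parameter idea (the parameter, server, and table names here are hypothetical), a Power Query parameter can supply the SQL endpoint the query reads from, so you can change it in the semantic model's settings after publishing instead of editing the PBIX:

```m
// Hypothetical parameter, defined via Manage Parameters in Power BI Desktop:
// LakehouseServer = "dev-endpoint.datawarehouse.fabric.microsoft.com"
let
    // The server name comes from the parameter rather than a hard-coded
    // string, so it can be switched to the production endpoint online.
    Source = Sql.Database(LakehouseServer, "MyLakehouse"),
    // Navigate to a table exposed by the lakehouse's SQL endpoint.
    Sales = Source{[Schema = "dbo", Item = "Sales"]}[Data]
in
    Sales
```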
Yes, in Power BI Desktop you can connect to the Lakehouse's SQL endpoint:
1. Select OneLake data hub
2. Select the Lakehouse you wish to use as a data source
3. Click the down arrow on the green Connect button
4. Click 'Connect to SQL endpoint'
5. Select tables and click Transform data
6. Choose between Import mode or DirectQuery mode.
It feels similar to connecting to a SQL database.
Alternative approach:
If you know the SQL connection string of the Lakehouse, you can use the SQL Server connector in Power BI desktop and paste the Lakehouse's SQL connection string in the Server field, and then proceed to step 5 and 6 above.
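For illustration, the query produced by either approach (with Import mode chosen in step 6) looks roughly like this; the connection string and table name are placeholders, not real values:

```m
let
    // The server string is the lakehouse's SQL connection string, copied
    // from the lakehouse's settings in Fabric (placeholder shown here).
    Source = Sql.Database(
        "xxxxxxxx.datawarehouse.fabric.microsoft.com",
        "MyLakehouse"
    ),
    // Navigate to one of the tables selected in step 5.
    Customers = Source{[Schema = "dbo", Item = "Customers"]}[Data]
in
    Customers
```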
Hi frithjof_v,
Thank you for your quick reply!
Do you know if it is possible to re-use a semantic model when using Import mode via the SQL endpoint, similar to how one can connect with Direct Lake / DirectQuery to a semantic model made in a Fabric capacity?