I have a centralized semantic model published in a separate Fabric workspace called sematic model_wrokspace, and I created three reports based on that semantic model.
The semantic model was created by connecting to the SQL endpoint of the lakehouse using Power BI Desktop. Recently I read about the data warehousing concept in Fabric: a semantic model can be created inside the warehouse and modified according to business needs at the warehouse level, so there is no need to maintain a separate semantic model in that case.
1. I currently publish a centralized semantic model through Power BI Desktop and schedule a daily refresh. Could I avoid this scheduling with a semantic model inside a data warehouse, and is that preferable?
2. Can I create different layers of semantic models inside the data warehouse? For example, if I need to create another model using the same warehouse, is that possible?
Please let me know the natural path to follow from lakehouse tables to Power BI, and finally to maintaining Power BI reports inside the Power BI app in Fabric.
Hi @anusha_2023
We haven't heard from you since the last response and were just checking back to see whether you have a resolution yet.
If you do have a resolution, please share it with the community, as it can be helpful to others.
Otherwise, reply with more details and we will try to help.
Thanks.
Thanks for the reply. I need to go through the material and make the appropriate decisions accordingly. I am happy with this information and am closing the thread. Thank you very much!
Hi @anusha_2023, yes, there is a "default" semantic model that is created when a lakehouse or warehouse is created. You can also create a custom semantic model in the service itself. At the moment it seems like you are importing data into your semantic model.
In terms of options, yes, you can use the default semantic model or create a custom semantic model in the service, which opens up the option of Direct Lake. This is a storage mode in which you don't need to import data into the semantic model, yet it offers almost import-like performance. You can use this mode with lakehouses and warehouses. However, there are caveats to this storage mode: Learn about Direct Lake in Power BI and Microsoft Fabric - Power BI | Microsoft Learn
You can also model/edit your semantic model using Tabular Editor, which still allows Direct Lake: Editing your Direct Lake Datasets from Tabular Editor? Yes please! - YouTube
At the moment you can't natively create a new semantic model in Power BI Desktop, publish it to the service, and enable Direct Lake; hopefully this will be supported soon. Please read the "Model write support with the XMLA endpoint" section to see the options for deploying models to the service that support Direct Lake mode: Learn about Direct Lake in Power BI and Microsoft Fabric - Power BI | Microsoft Learn
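To illustrate the XMLA deployment route: a Direct Lake table in a model definition is expressed through a partition in `directLake` mode that points at a lakehouse/warehouse entity via a shared M expression. The fragment below is only a hedged sketch of the general shape of a TMSL `createOrReplace` script; all names (`SalesModel`, `Sales`, `<sql-endpoint>`, `<database>`) are placeholders, and the exact properties should be checked against the linked Direct Lake article:

```json
{
  "createOrReplace": {
    "object": { "database": "SalesModel" },
    "database": {
      "name": "SalesModel",
      "compatibilityLevel": 1604,
      "model": {
        "tables": [
          {
            "name": "Sales",
            "columns": [
              { "name": "Amount", "dataType": "decimal", "sourceColumn": "Amount" }
            ],
            "partitions": [
              {
                "name": "Sales",
                "mode": "directLake",
                "source": {
                  "type": "entity",
                  "entityName": "sales",
                  "expressionSource": "DatabaseQuery"
                }
              }
            ]
          }
        ],
        "expressions": [
          {
            "name": "DatabaseQuery",
            "kind": "m",
            "expression": "let Source = Sql.Database(\"<sql-endpoint>\", \"<database>\") in Source"
          }
        ]
      }
    }
  }
}
```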
In terms of your questions:
1. You can avoid scheduling a semantic model refresh by using Direct Lake (please see the Direct Lake article above and take care to review the limitations).
2. Yes, you can create multiple semantic models on top of the same warehouse or lakehouse.
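For context on question 1: with an import-mode model, besides the daily schedule you can trigger a refresh on demand through the Power BI REST API; with Direct Lake that call becomes unnecessary, since data is read straight from the Delta tables in OneLake. A minimal sketch of building such a refresh request, assuming you already have an Azure AD access token (the workspace and dataset IDs below are placeholders):

```python
import json
import urllib.request

# Hypothetical placeholder IDs -- substitute your own workspace (group) and dataset IDs.
GROUP_ID = "00000000-0000-0000-0000-000000000000"
DATASET_ID = "11111111-1111-1111-1111-111111111111"

def build_refresh_request(group_id: str, dataset_id: str, token: str) -> urllib.request.Request:
    """Build the POST request that triggers an on-demand dataset refresh.

    Endpoint: POST /v1.0/myorg/groups/{groupId}/datasets/{datasetId}/refreshes
    """
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
           f"/datasets/{dataset_id}/refreshes")
    body = json.dumps({"notifyOption": "MailOnFailure"}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

req = build_refresh_request(GROUP_ID, DATASET_ID, token="<aad-token>")
print(req.full_url)
# urllib.request.urlopen(req) would actually send it -- omitted here, since it needs a real token.
```

The point of the sketch is the contrast: this whole refresh pipeline (scheduled or API-driven) only exists for import mode; a Direct Lake model picks up new Delta data without it.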