
Thejeswar
Community Champion

Power BI in Microsoft Fabric - What should we expect?

Hello All,

I have been using Fabric for a couple of weeks now. From the perspective of someone who has used Power BI Desktop for many years, the Power BI experience in Fabric seems to be missing a lot of features, such as Field Parameters and Calculation Groups, when working with a Direct Lake semantic model.

 

Even the most basic feature, creating Calculated Columns, is not available in a Direct Lake semantic model. Another thing I noticed is that a Direct Lake model cannot be used in Import storage mode, even when creating a local model from it; the local model can only be a DirectQuery model.

 

This feels like taking the biggest powers of Power BI Desktop away from developers when they use Fabric.

 

I would like to hear from those of you who are already well into Fabric: what else should I expect to be unavailable in a Direct Lake semantic model that is available in Power BI Import or DirectQuery mode?

 

Thanks,

1 ACCEPTED SOLUTION
MFelix
Super User

Hi @Thejeswar ,

 

There are a few things about Fabric that need to be taken into account before making this comparison, especially when referring to Direct Lake mode.

 

Fabric stores data in Parquet files, an open format that allows better compression and faster querying. Direct Lake takes advantage of those Parquet files: instead of converting DAX queries into SQL queries, as DirectQuery does, the engine reads the Parquet files directly and returns the result.
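
To make the "reads the Parquet files directly" part concrete, here is a minimal sketch in plain Python with pyarrow. The file path is hypothetical, but it shows that the data sits in an open format any engine can read, with no SQL layer in between:

```python
# Minimal sketch: Parquet is an open format, so the data can be read
# directly, no SQL engine in between. The path below is hypothetical;
# in a Fabric notebook the default lakehouse mounts under /lakehouse/default.
import pyarrow.parquet as pq

table = pq.read_table("/lakehouse/default/Tables/sales/part-0001.parquet")
print(table.num_rows)   # row count read straight from the file
print(table.schema)     # column names/types come from the Parquet footer
```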

 

However, since the files are read directly, be aware that the only way to get data into the Parquet files is through the ETL process in the lakehouse/warehouse, using the different options available there.
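
For example, a typical way to land or reshape data is a short Spark job in a Fabric notebook. This is only a sketch, and the file and table names are made up:

```python
# Sketch of the ETL step that changes what Direct Lake sees: load a raw
# file and save it as a Delta table in the lakehouse. In a Fabric notebook
# the `spark` session already exists; getOrCreate() keeps the sketch
# self-contained elsewhere.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

raw = (spark.read
       .option("header", "true")
       .csv("Files/raw/sales.csv"))      # hypothetical source file

(raw.write
    .format("delta")                     # Delta over Parquet, Fabric's default
    .mode("overwrite")
    .saveAsTable("sales"))               # this table is what Direct Lake reads
```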

 

Calculated columns and calculated tables use DAX, and DAX results cannot be written into the Parquet files in OneLake; that is why you currently cannot create them, because there is no option to "load" the new data into the Parquet files. If you create an Import model based on your OneLake data, you can create the columns, but it won't be Direct Lake anymore.
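
In practice this means pushing what would have been a calculated column upstream into the ETL, so the value is materialized in the files Direct Lake reads. A rough sketch with made-up column names:

```python
# Instead of a DAX calculated column, derive the column during ETL so it
# is written into the Delta/Parquet files themselves.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

sales = spark.read.table("sales")        # existing lakehouse table
sales = sales.withColumn("Margin", F.col("Revenue") - F.col("Cost"))

sales.write.format("delta").mode("overwrite").saveAsTable("sales_enriched")
```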

 

On the second point you refer to, the local model: a Direct Lake model is meant to be created and used online, because it takes advantage of OneLake and the Parquet files and only reads the metadata of the tables and the locations of the files. If you enable the XMLA endpoint, you can edit Direct Lake models from the desktop, but be aware that all changes are written back to the service automatically, and you cannot build a report on that file and store it locally.
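
To illustrate the XMLA side, here is a rough, heavily hedged sketch using the .NET Tabular Object Model through pythonnet. It assumes the Analysis Services client libraries are installed, and the workspace and model names are made up:

```python
# Hedged sketch: connecting to a workspace's XMLA endpoint and reading a
# Direct Lake model's metadata. Assumes the AMO/TOM client libraries are
# installed and pythonnet can load them; all names are hypothetical.
import clr
clr.AddReference("Microsoft.AnalysisServices.Tabular")
from Microsoft.AnalysisServices.Tabular import Server

server = Server()
server.Connect("powerbi://api.powerbi.com/v1.0/myorg/MyWorkspace")

model = server.Databases.GetByName("MyDirectLakeModel").Model
for table in model.Tables:
    print(table.Name)    # Direct Lake holds metadata, not the row data

server.Disconnect()
```

Any change saved through this route lands directly in the service, which is exactly the behaviour described above.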

 

Concerning field parameters, you cannot create them directly on your model, but you can use an external tool such as Tabular Editor to create them. Although they are written in DAX, they are not materialized in any data values in your OneLake, but rather in references to parts of your model (columns, measures, and so on).
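
For reference, a field parameter is ultimately just a calculated table of (label, NAMEOF reference, order) rows. The sketch below shows the general shape of that DAX as a plain string, the kind of expression an external tool writes into the model; the table and column names are made up:

```python
# Hedged sketch: the general shape of the DAX behind a field parameter
# table. NAMEOF stores a reference to a model object, not data from
# OneLake, which is why an external tool can add one to a Direct Lake model.
FIELD_PARAMETER_DAX = """
Parameter = {
    ("Sales Amount", NAMEOF('Sales'[Amount]),   0),
    ("Quantity",     NAMEOF('Sales'[Quantity]), 1)
}
"""
print(FIELD_PARAMETER_DAX)
```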

 

Direct Lake is not about taking away the "biggest powers"; it is a completely different way to look at your modelling. It is mainly intended for very large datasets (billions of rows) where you need Import-like speed without the refresh latency.

 

Be aware that Direct Lake and Fabric are evolving, so a lot can change in the future.

 

See these links about Direct Lake mode for more ideas.

https://www.sqlbi.com/blog/marco/2024/04/06/direct-lake-vs-import-mode-in-power-bi/

https://www.linkedin.com/pulse/50-shades-direct-lake-everything-you-need-know-new-power-nikola-ilic-...

https://www.youtube.com/watch?v=eJVYmDq5YCw

 


Regards

Miguel Félix


Did I answer your question? Mark my post as a solution!

Proud to be a Super User!

Check out my blog: Power BI em Português




v-kongfanf-msft
Community Support

Hi @Thejeswar ,

 

As a supplement: the products are positioned differently. Delta Lake's architecture is designed to be interoperable with Delta tables created by other tools and vendors, and this openness and compatibility make Delta Lake a flexible and powerful data storage solution.
Its limitations compared to Desktop are largely due to the nature of the underlying storage system and Delta Lake's design choices. However, the functionality and features of the Direct Lake semantic model are constantly evolving toward better compatibility with Desktop.
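
To illustrate that openness, here is a small sketch reading the same Delta table from outside Power BI with the open-source deltalake (delta-rs) package. The OneLake path is hypothetical and authentication options are omitted:

```python
# Sketch: a Delta table written by Fabric can be read by other engines.
# The OneLake ABFS path below is hypothetical; storage_options for
# authentication are omitted for brevity.
from deltalake import DeltaTable

dt = DeltaTable(
    "abfss://MyWorkspace@onelake.dfs.fabric.microsoft.com/"
    "MyLakehouse.Lakehouse/Tables/sales"
)
print(dt.to_pandas().head())   # a plain pandas DataFrame, no Power BI involved
```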


The documentation also compares the Direct Lake storage mode to the Import and DirectQuery storage modes.

 

For more details, you can refer to the documents below:

Direct Lake overview - Microsoft Fabric | Microsoft Learn

Use composite models in Power BI Desktop - Power BI | Microsoft Learn

 

Best Regards,
Adamk Kong

 

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

 



