
SteveD2
Resolver I

Create a virtual table for multiple measures

Hi again,
I might be viewing the problem incorrectly, or there may be another function I've yet to familiarise myself with, but here's the situation: I have a table of 65,000 uniquely identified locations with Latitude and Longitude coordinates (high cardinality). I reduce the table to a smaller number of rows by filtering the Latitude and Longitude fields directly, then pass the reduced dataset to calculations to improve the performance of measures.

 

The problem is that filtering 65,000 rows down to 500 can take up to 1.8 seconds, which isn't bad, but there are 16 other measures that do exactly the same thing. That is, each measure filters the same table down to the same data set and then performs a different computation, which pushes the load time for the report's visuals out to 32 seconds, which is too long.

 

This doesn't sound efficient to me, and I was wondering whether there is a way of creating a virtual table (the filtered data set) so that, rather than going through the same filtering process again, each subsequent measure can access the table already filtered by a preceding measure.
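To illustrate the pattern, each of the 16 measures currently repeats something along these lines (a minimal sketch only; the Locations table, the column names and the literal latitude/longitude bounds are placeholders, not the actual model):

Example Measure 1 =
VAR ReducedLocations =
    // the same latitude/longitude filter is repeated inside every one of the 16 measures;
    // literal bounds are used here purely for illustration
    FILTER (
        Locations,
        Locations[Latitude] >= -38 && Locations[Latitude] <= -33
            && Locations[Longitude] >= 140 && Locations[Longitude] <= 150
    )
RETURN
    // each measure then performs its own computation over the reduced set
    COUNTROWS ( ReducedLocations )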

2 REPLIES
d_gosbell
Super User

The problem is that filtering 65,000 rows down to 500 can take up to 1.8 seconds, which isn't bad

 

Actually, this is terrible. The tabular engine can filter 65,000 rows in milliseconds. I just did a very simple test and was able to count a subset of 14,000 rows from a table with 63,000 rows in 46 milliseconds, and most of that time was spent rendering the results; the actual filtering in the tabular engine only took 4 milliseconds (measured using the Server Timings in DAX Studio). Obviously you are doing more complicated logic than a simple filtered count, but it still sounds like there might be some issues with your measures.
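For reference, that kind of test is just a simple filtered count run as a query in DAX Studio, roughly along these lines (a sketch only; the table, column and bounds are placeholders, not the actual test):

EVALUATE
ROW (
    "Filtered Count",
    // count only the rows that fall inside an arbitrary latitude band
    CALCULATE (
        COUNTROWS ( Locations ),
        Locations[Latitude] >= -38 && Locations[Latitude] <= -33
    )
)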

 

Have you seen this video from Marco Russo which shows how to isolate performance issues in Power BI reports? It's worth having a look at it, and maybe posting back here if you find some problem measures but are not sure how to optimize them.

SteveD2
Resolver I

Hi, thanks for the response, and yes, I have seen the video. I've been able to improve the performance by including a KEEPFILTERS statement in the filtering expression... I don't know why it works, but it just does. So now the basic filtering expression looks like this:

CALCULATETABLE (
    VALUES ( Table[Index] ),
    KEEPFILTERS (
        FILTER (
            VALUES ( Table[Index] ),
            NOT ( ISBLANK ( [Filter Test] ) )
        )
    )
)

and I use this to reduce the data set. However, I've since discovered that I have another table which I cross join against, which can create a virtual table of over 1.2M rows... Although I don't think this is an overly large table, it is taking a long time to render on visuals.

The typical coding for this type of measure looks like this:

VAR Table1 =
    CALCULATETABLE (
        VALUES ( Table[Index] ),
        KEEPFILTERS (
            FILTER (
                VALUES ( Table[Index] ),
                NOT ( ISBLANK ( [Filter Test] ) )
            )
        )
    )
    // the table is filtered to reduce the data set size

VAR Table2Rows =
    // the distinct indexes of the second table
    VALUES ( Table2[Index] )

VAR Final_Table =
    ADDCOLUMNS (
        SUMMARIZE (
            CROSSJOIN ( Table1, Table2Rows ),
            Table[Index],
            Table2[Index]
        ),
        "Dist", [Distance between points measure]
    )

RETURN
    // as an example: count the location pairs closer than some threshold x
    COUNTROWS (
        FILTER (
            Final_Table,
            [Dist] < x
        )
    )
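(The [Distance between points measure] itself isn't shown in this thread; a rough haversine-style sketch of what such a measure could look like, assuming Latitude and Longitude columns on both tables and distances in kilometres, is below.)

Distance between points measure =
VAR Lat1 = RADIANS ( MAX ( Table[Latitude] ) )
VAR Lat2 = RADIANS ( MAX ( Table2[Latitude] ) )
VAR DeltaLat = Lat2 - Lat1
VAR DeltaLong =
    RADIANS ( MAX ( Table2[Longitude] ) - MAX ( Table[Longitude] ) )
VAR A =
    // haversine formula: angular separation between the two points
    SIN ( DeltaLat / 2 ) ^ 2
        + COS ( Lat1 ) * COS ( Lat2 ) * SIN ( DeltaLong / 2 ) ^ 2
RETURN
    // great-circle distance in kilometres (Earth radius approximately 6371 km)
    2 * 6371 * ASIN ( SQRT ( A ) )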

Apart from reducing the row count in Table2, I'm not sure what else I can do. Any suggestions would be appreciated.
