
jpshepard
Advocate I

Concatenate a field from a dimension connected to the fact table by a bridge table - large dataset

Hello, I have a dimension that can have many different descriptor tags, and I would like to concatenate these tags on the fly based on what the user has selected. A very similar question was asked here recently (link), and it resulted in a beautiful measure that solves this, but I am running into trouble applying that solution to a large dataset: the visual is very slow and often times out with the new measure. Keeping with the same theme as the earlier question, I have thousands of workers, hundreds of tags, and hundreds of buildings, and the fact table has about 1 million rows. Are there improvements I can make to the model and/or measures to get better performance? I have tried additional filtering of the tables inside the measure, but it did not help. Link to the sample pbix file - I would truly appreciate any insight on this!
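For reference, the measure follows the usual CONCATENATEX-over-the-bridge pattern, roughly like the sketch below (the table and column names here are placeholders rather than my exact model):

Selected Tags =
CONCATENATEX (
    CALCULATETABLE (
        VALUES ( DimTag[TagName] ),    -- tags visible for the current worker(s)
        BridgeWorkerTag                -- apply the bridge table as a filter
    ),
    DimTag[TagName],
    ", ",
    DimTag[TagName], ASC
)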

jpshepard_1-1730757143399.png

 

1 REPLY
Anonymous
Not applicable

Hi @jpshepard,

 

I can't open your PBIX file. Regarding the slow visuals in Power BI Desktop, you can try the following:

You can use DAX Studio to inspect and tune your DAX statements from a performance perspective, and use Performance Analyzer to check how each report element performs when users interact with it and which aspects consume the most (or least) resources:

DAX Studio | DAX Studio

Use Performance Analyzer to examine report element performance in Power BI Desktop - Power BI | Micr...
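For example, once you have copied a slow visual's query from Performance Analyzer, you can paste it into DAX Studio, enable Server Timings, and benchmark variations of the measure. A rough sketch of such a test query (the table, column, and measure names are placeholders, since I can't see your model):

DEFINE
    MEASURE FactTable[Selected Tags (test)] =
        CONCATENATEX (
            CALCULATETABLE ( VALUES ( DimTag[TagName] ), BridgeWorkerTag ),
            DimTag[TagName], ", ", DimTag[TagName], ASC
        )
EVALUATE
SUMMARIZECOLUMNS (
    DimWorker[WorkerName],
    "Tags", [Selected Tags (test)]
)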

 

Use a star schema design to simplify relationships and improve query performance:

DAX Optimization Tips for Large Models - Microsoft Fabric Community

 

Some suggestions for the DAX functions you use:

The SUMMARIZE() function creates a table, which is typically a time-consuming operation, so you might consider optimizing performance by removing extension columns from SUMMARIZE() and adding them with ADDCOLUMNS() instead (see the sketch after the links below).

Optimize SUMMARIZE with ADDCOLUMNS in Dax #ssas #tabular #dax #powerpivot - SQLBI

Best practices using SUMMARIZE and ADDCOLUMNS - SQLBI

Optimizing DAX expressions involving multiple measures - SQLBI
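A minimal sketch of that rewrite, using placeholder table and column names rather than your actual model:

-- Instead of computing extension columns inside SUMMARIZE ...
EVALUATE
SUMMARIZE (
    FactTable,
    DimWorker[WorkerName],
    "Total Hours", SUM ( FactTable[Hours] )
)

-- ... group with SUMMARIZE and add the columns with ADDCOLUMNS
-- (CALCULATE forces the context transition for each group row):
EVALUATE
ADDCOLUMNS (
    SUMMARIZE ( FactTable, DimWorker[WorkerName] ),
    "Total Hours", CALCULATE ( SUM ( FactTable[Hours] ) )
)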

 

You can also optimize the data model, adjust the evaluation configuration settings in Power BI Desktop where the defaults are too low, and minimize the number of visuals on the report, since each visual adds to load time:

Optimization guide for Power BI - Power BI | Microsoft Learn

Evaluation configuration settings for Desktop - Power BI | Microsoft Learn

 

There are also some suggestions for optimizing the model here:

  • Remove unused tables or columns, where possible. 
  • Avoid distinct counts on fields with high cardinality – that is, millions of distinct values.  
  • Take steps to avoid fields with unnecessary precision and high cardinality. For example, split highly unique datetime values into separate columns (year, month, date, and so on), or, where possible, round high-precision fields to lower their cardinality (for example, 13.29889 -> 13.3); see the sketch after this list. 
  • When connecting to data sources via DirectQuery, consider indexing columns that are commonly filtered or sliced on. Indexing greatly improves report responsiveness. 
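A quick illustration of the cardinality point as DAX calculated columns (the names are hypothetical, and in practice this is best done upstream or in Power Query before the data is loaded):

-- Split a high-cardinality datetime and round a high-precision value:
Event Date = DATE ( YEAR ( FactTable[EventDateTime] ), MONTH ( FactTable[EventDateTime] ), DAY ( FactTable[EventDateTime] ) )
Event Hour = HOUR ( FactTable[EventDateTime] )
Amount (1 dp) = ROUND ( FactTable[Amount], 1 )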

 

Best Regards,

Liu Yang

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
