JaromG
Frequent Visitor

Discussion: Alternative DAX Functions for Faster Performance

Hi everyone,

 

I haven't found any good articles on this, and maybe that's because it's an "it depends" scenario. So I wanted to have a discussion with you all: which DAX functions perform faster than a comparable alternative? For example, I recently learned that SUMX is an iterator that evaluates row by row, and can be slower than CALCULATE(SUM()). The assumption for this discussion is that whatever functions we are comparing CAN achieve the same result either way. I just want to know some DAX alternatives that could speed up the performance of my models!
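To make the SUMX vs. CALCULATE(SUM()) comparison concrete, here is a minimal sketch. It assumes a hypothetical Sales table with an Amount column and a [Total Sales] base case; measure names are illustrative. Note that when summing a single bare column, the engine often optimizes both forms to the same plan; the iterator typically only costs more when the row-level expression is more complex (e.g. a multiplication of two columns).

```
-- Iterator form: SUMX walks the Sales table row by row
Sales Amount (iterator) =
SUMX ( Sales, Sales[Amount] )

-- Aggregator form: a single column scan pushed to the storage engine
Sales Amount (aggregator) =
CALCULATE ( SUM ( Sales[Amount] ) )
```

If the expression inside SUMX were something like Sales[Quantity] * Sales[Price], a common alternative is to precompute that product as a column during data load so the plain SUM form can be used instead.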

 

Thanks, everyone. I look forward to your feedback and responses.

1 ACCEPTED SOLUTION
Imrans123
Advocate V

Personally, I think it's more important to get your data model right before diving into DAX. For example, when a VAR (or any other DAX expression) has to evaluate over a fact table, it will be much slower than doing the equivalent work on a dimension table, since the row count is much lower on the dim table. Reading up on star schema design will help you there; I think that's one of the major contributors to more efficient evaluations.

 

Another thing I recently discovered while working with large datasets is filtering on integers rather than strings. For example, let's say my dataset has sales, and customers are either Gold, Silver, or Platinum tier. If I want to filter via CALCULATE, rather than using Customer[Tier]="Platinum" or Customer[Tier]="Gold", I'll create a custom integer column in Transform Data where Silver=1, Gold=2, Platinum=3, and then filter with Customer[TierInt]=1 and so on.
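The two filter styles above can be sketched as follows. This assumes a hypothetical [Total Sales] measure and a Customer[TierInt] column added in Transform Data (Silver = 1, Gold = 2, Platinum = 3); both measures return the same result.

```
-- String filter: compares text values when resolving the filter
Platinum Sales (text) =
CALCULATE ( [Total Sales], Customer[Tier] = "Platinum" )

-- Integer filter: comparisons on the integer key are cheaper,
-- and integer columns generally compress better in the model
Platinum Sales (int) =
CALCULATE ( [Total Sales], Customer[TierInt] = 3 )
```

One trade-off to keep in mind: the integer codes are opaque to report users, so you would typically keep the text Tier column for slicers and labels and use the integer column only inside measures.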


2 REPLIES
vinayganit
Frequent Visitor

For large datasets (1M+ rows), using COUNTROWS and VALUES in combination may put less strain on the DAX engine than DISTINCTCOUNT.

This is one point I picked up from a Udemy course; I'm looking for similar performance-based alternatives now.
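The two patterns being compared look like this. The Sales[CustomerID] column and measure names are hypothetical; both measures return the same distinct count, but they can produce different query plans, which is where the performance difference (if any) comes from.

```
-- Direct form: a single distinct-count aggregation
Distinct Customers (DISTINCTCOUNT) =
DISTINCTCOUNT ( Sales[CustomerID] )

-- Alternative form: materialize the distinct values, then count the rows
Distinct Customers (COUNTROWS) =
COUNTROWS ( VALUES ( Sales[CustomerID] ) )
```

One subtle difference worth knowing: DISTINCTCOUNT counts BLANK as a value, while VALUES also includes a blank row when one exists, so the two agree; if you want to exclude blanks, DISTINCTCOUNTNOBLANK or COUNTROWS ( DISTINCT ( ... ) ) variants behave differently and are worth testing on your own data.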

