Hello,
As the title states, I'm asking for advice on how I can make this DAX code more efficient. I have limited coding experience, let alone with DAX. Thank you in advance 🙂
ChiefsMonthlyPosByLoc =
-- Per-location counts: exclude "Non-response" from the denominator and
-- re-apply the current month and location on top of ALLSELECTED.
VAR _totalLOC =
    CALCULATE (
        COUNT ( 'RT YTD July 2023'[scalevalue_ispositive] ),
        FILTER (
            ALLSELECTED ( 'RT YTD July 2023' ),
            'RT YTD July 2023'[scalevalue_reporttext] <> "Non-response"
                && [Month Name] = MAX ( 'RT YTD July 2023'[Month Name] )
                && 'RT YTD July 2023'[location_name] = MAX ( 'RT YTD July 2023'[location_name] )
        )
    )
VAR _posLOC =
    CALCULATE (
        COUNT ( 'RT YTD July 2023'[scalevalue_ispositive] ),
        FILTER (
            ALLSELECTED ( 'RT YTD July 2023' ),
            'RT YTD July 2023'[scalevalue_ispositive] = "True"
                && [Month Name] = MAX ( 'RT YTD July 2023'[Month Name] )
                && 'RT YTD July 2023'[location_name] = MAX ( 'RT YTD July 2023'[location_name] )
        )
    )
VAR _pospercentLOC = _posLOC / _totalLOC
-- Enterprise-level counts: same logic without the location filter.
VAR _total =
    CALCULATE (
        COUNT ( 'RT YTD July 2023'[scalevalue_ispositive] ),
        FILTER (
            ALLSELECTED ( 'RT YTD July 2023' ),
            'RT YTD July 2023'[scalevalue_reporttext] <> "Non-response"
                && [Month Name] = MAX ( 'RT YTD July 2023'[Month Name] )
        )
    )
VAR _pos =
    CALCULATE (
        COUNT ( 'RT YTD July 2023'[scalevalue_ispositive] ),
        FILTER (
            ALLSELECTED ( 'RT YTD July 2023' ),
            'RT YTD July 2023'[scalevalue_ispositive] = "True"
                && [Month Name] = MAX ( 'RT YTD July 2023'[Month Name] )
        )
    )
VAR _pospercent = _pos / _total
RETURN
    IF (
        HASONEVALUE ( 'RT YTD July 2023'[location_name] ),
        _pospercentLOC,    -- a single location is in context
        _pospercent        -- otherwise return the enterprise-level percent
    )
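For reference, a leaner pattern for the same calculation is to let the matrix's own filter context supply the month and location instead of rebuilding them with FILTER ( ALLSELECTED (...) ) and MAX (...). A minimal sketch, assuming [scalevalue_ispositive] is stored as the text "True"/"False" exactly as it is compared above (the measure name is just a placeholder):

Positive % (sketch) =
VAR _total =
    CALCULATE (
        COUNT ( 'RT YTD July 2023'[scalevalue_ispositive] ),
        -- KEEPFILTERS intersects with, rather than replaces, any existing filter on the column
        KEEPFILTERS ( 'RT YTD July 2023'[scalevalue_reporttext] <> "Non-response" )
    )
VAR _pos =
    CALCULATE (
        COUNT ( 'RT YTD July 2023'[scalevalue_ispositive] ),
        KEEPFILTERS ( 'RT YTD July 2023'[scalevalue_ispositive] = "True" )
    )
RETURN
    DIVIDE ( _pos, _total )    -- DIVIDE returns BLANK instead of an error when _total is blank or zero

Because this version respects whatever month, location, and question are already filtering the cell, it rolls up to an enterprise-wide percent on its own when the Location level is collapsed, which may make the HASONEVALUE branch unnecessary. It is not guaranteed to be exactly equivalent to the ALLSELECTED version in every visual, so it should be validated against the existing matrix.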
This result is based on the data you provided, not the screenshot from the live data.
|           | January | February | March | April |
|-----------|---------|----------|-------|-------|
| (+) Q1    |         |          |       |       |
| Location1 | 100%    |          |       | 0%    |
| Location2 |         |          |       | 0%    |
When collapsed, it should return the monthly positive percent across the whole enterprise for that question and not just an average based on the data currently in the matrix.
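If the measure keeps an explicit branch for the two levels, ISINSCOPE is usually a more reliable test than HASONEVALUE for telling the Location detail rows apart from the collapsed and subtotal rows, because it checks whether location_name is actually being grouped at the current matrix level. A minimal sketch; [Positive % by Location] and [Positive % Enterprise] are hypothetical names standing in for the two calculations already inside the measure:

ChiefsMonthlyPosByLoc (branching sketch) =
IF (
    ISINSCOPE ( 'RT YTD July 2023'[location_name] ),    -- TRUE on the Location detail rows
    [Positive % by Location],    -- hypothetical measure: the per-location logic (_pospercentLOC)
    [Positive % Enterprise]      -- hypothetical measure: the enterprise-wide logic (_pospercent)
)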
@rgu101 One thing I can tell you is that a single large table is never a good idea for a Power BI model.
Second, based on the sample you provided, what is the expected output?
I see, so I'll need to make or import multiple tables to break it up for efficiency?
The screenshots I provided of the table are the correct expected values based on the DAX code I wrote.
@rgu101 instead of sharing DAX code, you should share a sample pbix file with the data model and explain what you are trying to achieve. Read this post to get your answer quickly.
https://community.powerbi.com/t5/Community-Blog/How-to-Get-Your-Question-Answered-Quickly/ba-p/38490
Due to sensitive information, I'm unable to provide a sample pbix file. However, the data model is very simple since it's one large table that contains survey responses.
Here is an abridged example of the data table:
| Date    | Question Group | Question | Outreach ID | Location  | Positive Response |
|---------|----------------|----------|-------------|-----------|-------------------|
| January | A              | Q1       | aaa111      | Location1 | True              |
| January | A              | Q2       | aaa111      | Location1 | False             |
| January | A              | Q3       | aaa111      | Location1 | True              |
| March   | B              | Q6       | bbb222      | Location3 | True              |
| March   | B              | Q7       | bbb222      | Location3 | True              |
| March   | B              | Q8       | bbb222      | Location3 | True              |
| April   | A              | Q1       | ccc333      | Location1 | False             |
| April   | A              | Q2       | ccc333      | Location1 | False             |
| April   | A              | Q3       | ccc333      | Location1 | True              |
| April   | C              | Q1       | ddd444      | Location2 | False             |
| April   | C              | Q5       | ddd444      | Location2 | False             |
| April   | C              | Q4       | ddd444      | Location2 | False             |
There are different question groups that may share common questions with other groups, and each block of rows is keyed by the Outreach ID, which identifies a unique survey responder. The goal of my DAX code is to produce a matrix that returns the percentage of positive responses by location.
However, I also want it to show the positive response rate for the whole enterprise when the top Question row is collapsed (hence the HASONEVALUE check, because without it the measure doesn't properly return the enterprise-wide rate).
Currently, every cell in the matrix is produced by the calculated measure above, so I'm asking for advice on how to reduce its run-time and CPU usage.
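On the "one large table" advice above, a low-effort first step is to pull the descriptive attributes into small dimension tables (ideally in Power Query, or as calculated tables) and relate them back to the survey table, so slicers and the matrix filter through narrow dimensions rather than the wide fact table. A rough sketch as a calculated table, reusing the column names from the measure; a proper date table built from the real date column would be preferable to relating on month names:

DimLocation =
DISTINCT ( 'RT YTD July 2023'[location_name] )

-- Relate DimLocation[location_name] (one side) to 'RT YTD July 2023'[location_name] (many side),
-- single direction, and use DimLocation[location_name] on the matrix rows and in slicers
-- instead of the column from the large table.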
Thank you!