Hello everyone,
I am working on a report with a single table visual which has some 170 columns (including measures).
The data I am getting looks like the following:

| ID | Month_Year | KPI1 | KPI2 |
|---|---|---|---|
| 1 | Jan-2025 | 12 | 24 |
| 1 | Feb-2025 | 20 | 10 |
| 1 | Apr-2025 | 40 | 60 |
Now, based on this data and two filters (From Month and To Month), I am creating some measures to show the final data as:

| ID | KPI1 | KPI2 |
|---|---|---|
| 1 | Sum of values between selected months | Some logic defined below |
Some of the measures are normal -
You're working with a Power BI report that includes a single table visual displaying over 170 columns, many of which are complex DAX measures that use filters, date ranges, and conditional logic. These measures are calculated dynamically based on slicer selections (like "From Month" and "To Month") and involve row-level computations, including filters using TOPN, CALCULATE, and VAR logic. Because all these measures are evaluated simultaneously for each row in the visual, and the table has a wide structure, the report is consuming excessive memory and computational resources, resulting in the "Resources exceeded" error—especially since you're not using Premium capacity. Additionally, you have a filter condition (Date_Filter) that adds further complexity by evaluating each row's status and date range to conditionally include or exclude data.
To improve performance, the following steps are recommended:
Break the large visual into smaller sections using bookmarks or multiple pages, reducing the memory footprint per visual.
Pre-aggregate the data at the query level (using Power Query or SQL), or create intermediate summary tables in DAX, to avoid expensive row-level filtering and repeated logic across measures.
Optimize the measures by avoiding redundant use of FILTER and TOPN, and use simpler filter expressions within CALCULATE instead.
If possible, move logic like Date_Filter into a calculated column during data load, which offloads that work from visual evaluation at runtime (see the sketch below).
Finally, use the Performance Analyzer in Power BI Desktop to identify which specific measures are causing the most delay, and optimize those first.
These steps should collectively help you avoid memory overload and improve responsiveness, even without Premium capacity.
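As an illustration of the calculated-column suggestion, here is a minimal sketch. The actual Date_Filter rules are not shown in this thread, so the [Status] column, the status value, and the conditions below are hypothetical placeholders; the point is only that a flag computed once at refresh time is far cheaper than re-evaluating the same logic for every row and every measure in the visual.

-- Hypothetical calculated column on 'Table'; adjust the column names
-- and conditions to match the real Date_Filter logic
Date_Filter Flag =
VAR RowStatus = 'Table'[Status]            -- assumed status column
VAR RowMonth  = 'Table'[MonthYearKey]
RETURN
    IF (
        RowStatus = "Active"               -- placeholder condition
            && NOT ISBLANK ( RowMonth ),   -- placeholder condition
        1,
        0
    )

A visual-level filter on [Date_Filter Flag] = 1 can then replace the per-row measure evaluation at runtime.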
Hi @PowerBIDAXProb
Thank you for reaching out to the Microsoft Fabric Community.
I have reproduced your scenario in Power BI using your logic for KPI calculations, From/To month filters, and the conditional status filtering. After implementing your measures and applying the visual-level filter based on Date_Filter, I was able to get the expected output as per your requirement.
For your reference, I’m attaching the .pbix file I used for this validation so you can review and adjust it to fit your full dataset.
If this information is helpful, please “Accept as solution” to assist other community members in resolving similar issues more efficiently.
Thank you.
I hope the information shared was helpful to you. If your question has been answered, kindly mark the most relevant reply as the Accepted Solution. This small action can make a big difference for others who are looking for the same solution.
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions.
Hope everything's going smoothly on your end. We haven't heard back from you, so I wanted to check if the issue got sorted. If yes, marking the relevant solution would be awesome for others who might run into the same thing.
Optimized Sum Measure =
-- Selected period boundaries from the two slicer tables
VAR FromMonth = MAX ( 'From-Month/YearMaster'[FromMonthYearKey] )
VAR ToMonth = MAX ( 'To-Month/YearMaster'[ToMonthYearKey] )
RETURN
    -- Boolean filters on a single column avoid iterating the whole table with FILTER
    CALCULATE (
        SUM ( 'Table'[KPI1] ),
        'Table'[MonthYearKey] >= FromMonth,
        'Table'[MonthYearKey] <= ToMonth
    )
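One reason this form tends to be cheaper: the period boundaries are resolved once into variables, and the filter arguments are simple predicates on a single column ('Table'[MonthYearKey]), so the engine can apply them as range filters rather than iterating the whole table with FILTER for every cell of a wide visual.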
Reduce the number of columns in the visual: with 170 columns, only include the ones that are necessary, to minimize memory usage.
Use variables to store intermediate results: this helps reuse calculated values and avoids recalculating them multiple times.
Filter data early: Apply filters as early as possible in your calculations to reduce the amount of data being processed.
Simplify calculations: Simplify your DAX measures where possible. For example, avoid using complex nested calculations if they can be broken down into simpler steps.
Optimize the use of CALCULATE and FILTER: Ensure that you are using these functions efficiently. For example, avoid using FILTER inside CALCULATE if you can achieve the same result with simpler conditions. A sketch of the pattern to avoid follows this list.
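To make the variable and CALCULATE/FILTER points concrete, here is a minimal sketch of the pattern to avoid, using the table and column names from the measure above (the measure name itself is made up for illustration). Compare it with the Optimized Sum Measure earlier in this thread, which produces the same result with variables and simple single-column predicates.

-- Pattern to avoid: FILTER over the whole 'Table' materializes and iterates
-- every row for each cell of the visual
KPI1 Between Months Slow =
CALCULATE (
    SUM ( 'Table'[KPI1] ),
    FILTER (
        'Table',
        'Table'[MonthYearKey] >= MAX ( 'From-Month/YearMaster'[FromMonthYearKey] )
            && 'Table'[MonthYearKey] <= MAX ( 'To-Month/YearMaster'[ToMonthYearKey] )
    )
)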
Proud to be a Super User!
Hey,
Thanks for replying. But I think you have just reformatted the same measures. The only change I see is in the second measure, where you have added a new variable. Could you please tell me if I am missing something?
Also, could you please edit your message to replace the table name with 'Table' for security reasons? My bad, I mentioned it earlier in the main question.