Hi Power BI Community,
I'm working on an ABC analysis for our product sales data, but I keep running into a "Resources Exceeded" error whenever I try to create the DAX measure for this calculation. Our dataset is quite large, with detailed sales records at the product level. Here’s what I’ve attempted so far:
Data Structure:
Current Measures:
Issues Encountered:
Question: Has anyone successfully performed an ABC analysis with large, detailed datasets in Power BI? I’d love to hear any methods or optimizations that worked for you, especially if there’s a more efficient DAX approach to handle cumulative calculations or class segmentation with high volumes of data.
Thank you in advance!
Run-time calculations like this over large data will always be slow, so I'm not surprised to hear of your issues. Anything that removes unnecessary resource consumption will help. I suggest removing the product description from the fact table and putting it in a product table, ideally with an integer as the key for the relationship. Also, check the precision of the sales value column; reducing precision should help, e.g. rounding to the nearest integer. Finally, consider summarising the data across dimensions that don't matter for this calculation: if your data is at day-level granularity but you only run these calculations at month level, create a summarised table at month level just for this calculation.
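One way to take the run-time cost out entirely is to precompute the ABC class as a calculated column on the product table, so the expensive ranking happens once at data refresh instead of on every visual interaction. A minimal sketch of that static-segmentation pattern, using placeholder names (Products, Sales, Sales[Amount]) rather than your actual model:

```dax
-- Static ABC classification, evaluated once at refresh.
-- 'Products', 'Sales' and Sales[Amount] are placeholder names.
ABC Class =
VAR ProductSales =
    SUMX ( RELATEDTABLE ( Sales ), Sales[Amount] )
VAR CumulatedSales =
    SUMX (
        -- products selling at least as much as the current one
        FILTER (
            ALL ( Products ),
            SUMX ( RELATEDTABLE ( Sales ), Sales[Amount] ) >= ProductSales
        ),
        SUMX ( RELATEDTABLE ( Sales ), Sales[Amount] )
    )
VAR TotalSales =
    SUM ( Sales[Amount] )
RETURN
    SWITCH (
        TRUE (),
        DIVIDE ( CumulatedSales, TotalSales ) <= 0.80, "A",
        DIVIDE ( CumulatedSales, TotalSales ) <= 0.95, "B",
        "C"
    )
```

The 80%/95% cut-offs are the conventional A/B thresholds and can be adjusted. The refresh itself pays the ranking cost once, but report queries then only group or filter by the column, which is cheap.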
Hello,
You can use visual "Pareto+" to create a graph and table or "Pareto by sio2Graphs" to produce the graph.
The Total Cost is on the left axis, with the cumulative % shown on the right and displayed on the arc. The number of units is on the x-axis with a random name. I have put the Total Units by Class and Percentage by Class in the tooltips.
If you have hundreds of categories, remove the padding and margins; if the labels still won't fit vertically, you can turn off the x-axis.
sio2Graphs 😀
cynthia@sio2graphs.com
Hi @FelipMark ,
I usually work on large process data in DirectQuery mode and used to get this error. There is no single method of DAX optimization; it depends on the data model and how you query it. One thing I will suggest: avoid row-level iterator functions like SUMX, MAXX, MINX, etc., as they consume the most resources. I hope you are not using SUMX over the fact table for the cumulative calculation.
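For the cumulative part of a Pareto calculation, a measure along these lines avoids iterating the fact table row by row: it iterates only the (much smaller) product dimension and lets the storage engine compute each sum. This is a sketch, assuming a hypothetical [Total Sales] base measure and a Products dimension table (not your actual names):

```dax
-- Cumulative sales for a Pareto / ABC chart.
-- [Total Sales] and 'Products' are placeholder names.
Cumulative Sales =
VAR CurrentSales = [Total Sales]
RETURN
    CALCULATE (
        [Total Sales],
        -- keep only products selling at least as much as the current one
        FILTER (
            ALLSELECTED ( Products ),
            [Total Sales] >= CurrentSales
        )
    )
```

The FILTER still iterates, but over the product list (thousands of rows) rather than the sales fact table (millions), which is usually the difference between a working visual and "Resources Exceeded".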
Thanks for your suggestions!
To optimize performance, I already aggregated my sales data in Power Query before loading it into Power BI. My transformation removes unnecessary columns, filters only "Normal" sales, and groups data at the store, product, and month level. Here’s the Power Query code I’m using:
let
    Source = fVendas,
    // 1. Filter records before any transformation
    FilterNormal = Table.SelectRows(Source, each [situacao] = "Normal"),
    // 2. Select only the necessary columns before processing the data
    SelectColumns = Table.SelectColumns(FilterNormal, {"DATA_EMISS", "lkpdv", "LKEMPRESA", "LKPRODUTO", "TotalItem", "QUANTIDADE"}),
    // 3. Create Month and Year columns (keeping DATA_EMISS as a date)
    AddMonth = Table.AddColumn(SelectColumns, "Month", each Date.Month([DATA_EMISS]), Int64.Type),
    AddYear = Table.AddColumn(AddMonth, "Year", each Date.Year([DATA_EMISS]), Int64.Type),
    // 4. Create a Reference Date column (first day of the month)
    AddReferenceDate = Table.AddColumn(AddYear, "ReferenceDate", each #date([Year], [Month], 1), type date),
    // 5. Group data to reduce volume, summing TotalItem and QUANTIDADE (quantity)
    GroupedData = Table.Group(AddReferenceDate, {"Year", "Month", "ReferenceDate", "lkpdv", "LKEMPRESA", "LKPRODUTO"},
        {{"TotalValue", each List.Sum([TotalItem]), type number},
         {"TotalQuantity", each List.Sum([QUANTIDADE]), type number}}),
    // 6. Change data type for better usability
    #"Changed Type" = Table.TransformColumnTypes(GroupedData, {{"TotalValue", Currency.Type}})
in
    #"Changed Type"
Even after summarizing the data, I still run into the "Resources Exceeded" error when calculating cumulative sales for the Pareto percentage in my ABC analysis.
Given that my sales table has over 10 million rows and the product table contains more than 20,000 products, do you have any additional suggestions?
Would a different DAX approach or another pre-aggregation step help optimize cumulative calculations at scale?
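Since the ABC ranking only needs one row per product, one further pre-aggregation step is a small product-level calculated table, so the cumulative logic runs over roughly 20,000 rows instead of 10 million. A sketch reusing the column names from the query above, and assuming the loaded table keeps the name fVendas:

```dax
-- Calculated table: one row per product with its total sales.
-- fVendas[LKPRODUTO] and fVendas[TotalValue] come from the Power Query step;
-- the table name 'ProductSalesSummary' is an assumption.
ProductSalesSummary =
SUMMARIZECOLUMNS (
    fVendas[LKPRODUTO],
    "ProductTotal", SUM ( fVendas[TotalValue] )
)
```

The ABC class can then be computed as a calculated column on this summary table and related back to the product dimension, keeping all the expensive work at refresh time.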
Thank you for the suggestion! I really appreciate the insights. I’ll definitely try creating a summarized table at the month level since we typically view this indicator at a monthly minimum. This should hopefully reduce the strain on resources and allow for smoother calculations. Thanks again for pointing me in the right direction!
Hi @FelipMark ,
Have you solved your problem? If so, could you share your solution here and mark it as the accepted answer, to help other members find it faster? Thank you very much for your kind cooperation!
Best Regards
Yilong Zhou