FelipMark
Helper II

Optimizing ABC Analysis in Power BI with Large Sales Data – Exceeding Resources Issue

Hi Power BI Community,

I'm working on an ABC analysis for our product sales data, but I keep running into a "Resources Exceeded" error whenever I try to create the DAX measure for this calculation. Our dataset is quite large, with detailed sales records at the product level. Here’s what I’ve attempted so far:

  1. Data Structure:

    • My sales table (SalesFact) has individual sales records, including columns like ProductDescription (product description), TotalItem (total sales amount per line), and Status (indicating if the sale is "Normal").
    • I've created a supporting table for the ABC classes with columns defining ranges (e.g., LowerLimit and UpperLimit for class limits).
  2. Current Measures:

    • I have a [Sales] measure to sum sales for "Normal" status:
       
      Sales = CALCULATE(SUM(SalesFact[TotalItem]), SalesFact[Status] = "Normal")

    • To calculate the cumulative sales percentage (ParetoPercentage), I’m dividing an accumulated sales measure (CumulativeSalesValue) by the total sales.

  3. Issues Encountered:

    • I've tried different ways to structure these measures, including creating cumulative totals and classifying products based on the cumulative percentage for Pareto analysis. However, each attempt ends up with the "Resources Exceeded" error, likely due to the detailed level of data and the complex filtering needed for each ABC class.
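
For reference, the cumulative pattern I've been attempting looks roughly like this (a generic Pareto formulation using the column names above; my actual attempts varied):

CumulativeSalesValue =
VAR CurrentSales = [Sales]
RETURN
    SUMX (
        FILTER (
            ALLSELECTED ( SalesFact[ProductDescription] ),
            [Sales] >= CurrentSales
        ),
        [Sales]
    )

ParetoPercentage =
DIVIDE (
    [CumulativeSalesValue],
    CALCULATE ( [Sales], ALLSELECTED ( SalesFact[ProductDescription] ) )
)

The FILTER over every product, re-evaluating [Sales] per row, is what seems to blow up on the detailed data.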

Question: Has anyone successfully performed an ABC analysis with large, detailed datasets in Power BI? I’d love to hear any methods or optimizations that worked for you, especially if there’s a more efficient DAX approach to handle cumulative calculations or class segmentation with high volumes of data.

Thank you in advance!

1 ACCEPTED SOLUTION
MattAllington
Community Champion

This type of run-time calculation on large data will always be slow, so I'm not surprised to hear of your issues. Anything you can do to remove unnecessary resource consumption will help. I suggest removing the product description from the fact table and putting it in a product table, ideally with an integer as the key for the relationship. Also, check the precision of the sales value column; reducing precision should help, e.g. round to the nearest integer. Consider summarising the data across dimensions that don't matter for this calculation: for example, if your data is at day-level granularity but you only run these calculations at month level, create a summarised table at month level just for this calculation.
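
As a concrete sketch of that summarisation idea (the table and column names below are illustrative, not from the original model; ProductKey and MonthStart are assumed), a month-level summary could be built as a DAX calculated table:

-- Hypothetical month-level summary table, per the suggestion above.
MonthlySalesSummary =
SUMMARIZECOLUMNS (
    SalesFact[ProductKey],      -- assumed integer key after moving descriptions out
    SalesFact[MonthStart],      -- assumed month-level date column
    FILTER ( ALL ( SalesFact ), SalesFact[Status] = "Normal" ),
    "MonthlySales", ROUND ( SUM ( SalesFact[TotalItem] ), 0 )  -- reduced precision
)

Pointing the cumulative measures at a table like this, rather than the raw fact table, shrinks the number of rows the formula engine has to scan on every evaluation.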



* Matt is an 8 times Microsoft MVP (Power BI) and author of the Power BI Book Supercharge Power BI.
I will not give you bad advice, even if you unknowingly ask for it.


6 REPLIES
sio2Graphs
Helper I

Hello, 

 

You can use the "Pareto+" visual to create a graph and table, or "Pareto by sio2Graphs" to produce just the graph.

The total cost is on the left, with the cumulative % shown on the right and displayed on the arc. The number of units is on the x-axis, labelled with a random name. I have put the total units by class and the percentage by class in the tooltips.

If you have hundreds of categories, remove the padding and margins; and if the labels still won't fit even vertically, you can turn off the x-axis.

 

sio2Graphs 😀

cynthia@sio2graphs.com

sio2graphs on AppSource 

 

ABC.png

Rupak_bi
Solution Sage

Hi @FelipMark ,

I usually work on large process data in DirectQuery mode and used to get this error. There is no single method of DAX optimization; it depends on the data model and how you query it. One thing I will suggest: avoid row-level iterators like SUMX, MAXX, MINX, etc., as they consume the most resources. I hope you are not using SUMX here for the cumulative calculation.
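
For example (Quantity and UnitPrice below are hypothetical columns, just to illustrate the point), a row-by-row iterator can often be replaced by a plain aggregation over a precomputed column:

-- Row-by-row multiplication, evaluated per record (expensive on large tables):
Sales Amount Slow = SUMX ( SalesFact, SalesFact[Quantity] * SalesFact[UnitPrice] )

-- Same result if the line total is precomputed (e.g. in Power Query or the source) as TotalItem:
Sales Amount Fast = SUM ( SalesFact[TotalItem] )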



Regards
Rupak
FOLLOW ME : https://www.linkedin.com/in/rupaksar/

Thanks for your suggestions!

To optimize performance, I already aggregated my sales data in Power Query before loading it into Power BI. My transformation removes unnecessary columns, filters only "Normal" sales, and groups data at the store, product, and month level. Here’s the Power Query code I’m using:

let
    Source = fVendas,

    // 1. Filtering records before any transformation
    FilterNormal = Table.SelectRows(Source, each [situacao] = "Normal"),

    // 2. Selecting only the necessary columns before processing the data
    SelectColumns = Table.SelectColumns(FilterNormal, {"DATA_EMISS", "lkpdv", "LKEMPRESA", "LKPRODUTO", "TotalItem", "QUANTIDADE"}),

    // 3. Creating Month and Year columns (keeping DATA_EMISS as a date)
    AddMonth = Table.AddColumn(SelectColumns, "Month", each Date.Month([DATA_EMISS]), Int64.Type),
    AddYear = Table.AddColumn(AddMonth, "Year", each Date.Year([DATA_EMISS]), Int64.Type),

    // 4. Creating a Reference Date column (first day of the month)
    AddReferenceDate = Table.AddColumn(AddYear, "ReferenceDate", each #date([Year], [Month], 1), type date),

    // 5. Grouping data to reduce volume before further operations, including QUANTIDADE (Quantity)
    GroupedData = Table.Group(AddReferenceDate, {"Year", "Month", "ReferenceDate", "lkpdv", "LKEMPRESA", "LKPRODUTO"}, 
        {{"TotalValue", each List.Sum([TotalItem]), type number},
         {"TotalQuantity", each List.Sum([QUANTIDADE]), type number}}),
    
    // 6. Changing data type for better usability
    #"Changed Type" = Table.TransformColumnTypes(GroupedData,{{"TotalValue", Currency.Type}})
in
    #"Changed Type"

 

Even after summarizing the data, I still run into the "Resources Exceeded" error when calculating cumulative sales for the Pareto percentage in my ABC analysis.

Given that my sales table has over 10 million rows and the product table contains more than 20,000 products, do you have any additional suggestions?

Would a different DAX approach or another pre-aggregation step help optimize cumulative calculations at scale?
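
In case it helps, one direction I'm considering is a simple lookup against the supporting class table so the heavy [ParetoPercentage] measure is evaluated only once per product (ABCClasses and its Class column stand in for the supporting table from my first post; the exact names may differ):

ABC Class =
VAR Pct = [ParetoPercentage]
RETURN
    MAXX (
        FILTER (
            ABCClasses,
            Pct > ABCClasses[LowerLimit]
                && Pct <= ABCClasses[UpperLimit]
        ),
        ABCClasses[Class]
    )

Materialising this as a calculated column on a month-level summary table, so it is computed once at refresh rather than at query time, is another option worth testing.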

Thank you for the suggestion! I really appreciate the insights. I’ll definitely try creating a summarized table at the month level since we typically view this indicator at a monthly minimum. This should hopefully reduce the strain on resources and allow for smoother calculations. Thanks again for pointing me in the right direction!

Anonymous
Not applicable

Hi @FelipMark ,

Have you solved your problem? If so, can you share your solution here and mark the helpful reply as the accepted answer, so other members can find it faster? Thank you very much for your kind cooperation!

 

 

Best Regards

Yilong Zhou
