

DavidK
Frequent Visitor


Hi,

Please would you be so kind as to help me with my DAX code? I can't understand why I don't have enough memory.


My code is:

Column = COUNTROWS(FILTER(Sales_table, Sales_table[Date_issued_X] <= Combination[Date] && Sales_table[ID] = Combination[ID]))


but when I use only one condition, for example:


Column = COUNTROWS(FILTER(Sales_table, Sales_table[Date_issued_X] <= Combination[Date]))


I don't have a problem with memory. But when I use both conditions in the filter, I receive this message:


Memory error: Memory Allocation failure . Try simplifying or reducing the number of queries.

 

Thank you in advance.

Dávid

1 ACCEPTED SOLUTION

Hi @DavidK,

This is more likely related to the amount of data. After checking your sample file, I found the two tables contain 81M and 40K records.

For this scenario, your formula will iterate and calculate over more than 3,240 billion (81M × 40K) row combinations. That obviously consumes a huge amount of system resources.

Adding more conditions to the expression costs more resources than a version with fewer conditions. (It looks like your device is good enough to process the simple version but fails on the version with more conditions.)

In my opinion, I'd suggest you use a measure instead of the calculated column formula; a measure is only evaluated when you use it in a visualization.
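A minimal sketch of such a measure, assuming the table and column names from the original post and that a single Combination row is in context (the measure name is made up for illustration):

```dax
-- Hypothetical measure version of the calculated column.
-- SELECTEDVALUE returns BLANK if more than one value is in the filter context.
Sales Count To Date :=
VAR CurrentDate = SELECTEDVALUE ( Combination[Date] )
VAR CurrentID   = SELECTEDVALUE ( Combination[ID] )
RETURN
    CALCULATE (
        COUNTROWS ( Sales_table ),
        Sales_table[ID] = CurrentID,
        Sales_table[Date_issued_X] <= CurrentDate
    )
```

Because the two filters are simple column predicates compared against variables, CALCULATE can apply them directly rather than iterating Sales_table row by row.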

In addition, you can also try pre-aggregating the data to reduce the range of records the loop has to scan; the calculation then runs over far fewer records and can avoid the 'memory allocation' issue.
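One way to sketch that pre-aggregation idea (the table name PreAgg and the "RowCount" column are made up for illustration): first group Sales_table by ID and date, then let the calculated column scan the much smaller grouped table.

```dax
-- Hypothetical calculated table: one row per (ID, Date_issued_X) pair,
-- with the number of raw Sales_table rows pre-counted per group.
PreAgg =
SUMMARIZE (
    Sales_table,
    Sales_table[ID],
    Sales_table[Date_issued_X],
    "RowCount", COUNTROWS ( Sales_table )
)

-- The calculated column then sums pre-counted groups instead of
-- scanning 81M raw Sales_table rows for every Combination row.
Column =
VAR CurrentDate = Combination[Date]
VAR CurrentID   = Combination[ID]
RETURN
    SUMX (
        FILTER (
            PreAgg,
            PreAgg[ID] = CurrentID
                && PreAgg[Date_issued_X] <= CurrentDate
        ),
        PreAgg[RowCount]
    )
```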

Notice: 1 GB of memory equals 1,048,576 KB. If we suppose each 100K rows of looping consumes 1 KB of memory, then 3,240B rows would require about 31 GB to cache and calculate over the expanded range of records, or the engine would have to execute with whatever resources are idle until the calculation times out. (The real scenario may not be as complex or perform as poorly as I supposed.)

Regards,

Xiaoxin Sheng

Community Support Team _ Xiaoxin
If this post helps, please consider accept as solution to help other members find it more quickly.


5 REPLIES
v-shex-msft
Community Support

Hi @DavidK,

How much free memory does your device have, and are you working on the 32-bit desktop client? AFAIK, 32-bit applications have a 4 GB memory limit:

Memory Limits for Windows and Windows Server Releases 

I think this is related to iterator calculations: your formula is evaluated for each row of the 'Combination' table, and each evaluation iterates over and checks the Sales table records ('Combination' row count × 'Sales' table row count).

Optimizing nested iterators in DAX  
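As a sketch of what that article suggests (using the same table and column names as the original post), the FILTER-based column can be rewritten with variables plus CALCULATE, so the two conditions become simple column filters that the storage engine can evaluate, instead of a row-by-row scan of Sales_table for every Combination row:

```dax
Column =
VAR CurrentDate = Combination[Date]   -- capture the current row's values
VAR CurrentID   = Combination[ID]     -- before CALCULATE changes context
RETURN
    CALCULATE (
        COUNTROWS ( Sales_table ),
        Sales_table[ID] = CurrentID,
        Sales_table[Date_issued_X] <= CurrentDate
    )
```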

BTW, you can also use DAX Studio to trace how many rows the DAX formula scans:

DAX Studio 

Regards,

Xiaoxin Sheng

Community Support Team _ Xiaoxin
If this post helps, please consider accept as solution to help other members find it more quickly.

I can't understand why it doesn't work correctly 😞 ... I've added an example with training data here. Please would you be so kind as to check where I'm making a mistake? 😞

https://drive.google.com/drive/folders/1XoFc1e0odTNQ1SiNO6hIV1-9F-gGpjrD?usp=sharing 

 

Thank you in advance.

Pragati11
Super User

Hi @DavidK ,

 

Just want to check: in your data, do you have rows where both of these conditions are met simultaneously?

 

Thanks,

Pragati

 

 

Best Regards,

Pragati Jain




LinkedIn | Twitter | Blog YouTube 

Did I answer your question? Mark my post as a solution! This will help others on the forum!

Appreciate your Kudos!!

Proud to be a Super User!!

Yes, the second condition should reduce the row count.
