I'm working with a measure defined as:
GLOS =
VAR Number = [GProduct]
VAR ThisRoot = [IP_Count]
RETURN
Number ^ (1 / ThisRoot)
[GProduct] is the result of a PRODUCT(Numeric Column) calculation.
The [IP_Count] in the ThisRoot variable is a basic row count.
Both are typed as whole numbers.
Formula returns the nth root of the product.
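For reference, a minimal sketch of how those two base measures might look, assuming an illustrative table named CAC_AllData (the table name used later in this thread) and an illustrative numeric column [DaySpan]:

GProduct = PRODUCT ( CAC_AllData[DaySpan] )    // product of every value in the column
IP_Count = COUNTROWS ( CAC_AllData )           // basic row count, the n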
This works up until ThisRoot (the n) gets to somewhere over 1,047 and less than 1,200 (1,047 being an n that worked for the total count across several months, and 1,200 being an n that failed when the next month was added).
I'll assume for a moment that this represents a size limit for a decimal number (17,3).
How can I adjust that size limit? I will frequently need to present this calculation with an n that may be up to 10,000 or so.
Solved! Go to Solution.
The Int64 limit was the blocker for what I was trying to do; however, researching and testing all of this, including the LN approach, exposed a couple of problems in the data.
The values of the column being evaluated are the result of a DATEDIFF function returning a number of days. While it is not the norm, it's possible for DATEDIFF on DAY to return 0 if the start datetime is in the very early morning and the end datetime is late in the evening approaching midnight: May 7 to May 7 is 0.
To avoid operating on zeros, which causes errors, I modified the DATEDIFF to operate on MINUTE, divided that result by 60 to get an hour span (needed for another calculation), and divided it further by 24 to get a day span. That gives cases that don't quite span a full 24 hours a decimal result, e.g. 20 hours becomes 0.833333. That solved the 0 problem.
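A sketch of the same idea expressed as a DAX calculated column (the actual change was made in the SQL source; the table and column names here are illustrative):

DaySpan =
VAR SpanMinutes =
    DATEDIFF ( CAC_AllData[StartDateTime], CAC_AllData[EndDateTime], MINUTE )
RETURN
    SpanMinutes / 60 / 24    // e.g. 1,200 minutes = 20 hours = 0.833333 days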
I still got an error despite that cleanup in some outlier cases, and discovered a data quality issue: a datetime of, say, 2025-05-07 23:59 in one table somehow got recorded in a second location (after systemic processing) as 2025-05-07 00:00. Note it didn't roll the day forward to 2025-05-08, so I had a start datetime of 2025-05-07 07:41 but an end datetime of 2025-05-07 00:00, which is impossible: you can't end before you began.
Changing where that end datetime was pulled from removed those outliers (along with some error checking in case it slips in again) and led to an accurate, positive, non-zero result in the SQL calculation for that column.
Once all that was done, GEOMEAN() worked as intended without having to convert to log values; apparently the Int64 limit doesn't apply to its own internal calculations, because the product did exceed Int64 in some cases.
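For reference, a minimal sketch of that built-in approach, assuming the same illustrative table and column names as above:

GLOS = GEOMEAN ( CAC_AllData[DaySpan] )    // GEOMEANX ( table, expression ) is the iterator form if you need an expression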
All in all, it's a better outcome to be able to use the built-in function rather than recreating it, and it taught me to double-check my input values earlier in the troubleshooting process.
Hi @AnnieV,
Just following up to see if the solution provided was helpful in resolving your issue. Please feel free to let us know if you need any further assistance.
If the response addressed your query, kindly mark it as Accepted Solution and click Yes if you found it helpful — this will benefit others in the community as well.
Best regards,
Prasanna Kumar
Hi @AnnieV,
Thank you for reaching out to the Microsoft Fabric Forum Community.
To safely calculate the nth root of a product in Power BI without running into overflow errors, you should use a logarithmic transformation. Instead of directly computing the product of many values, which can quickly exceed Power BI's numeric limits, you sum the natural logarithms of the values and then apply the exponential function to the average log. This gives the same result as the nth root of the product (also known as the geometric mean) but avoids issues with large numbers. In DAX, this is implemented using SUMX over LN, followed by EXP(LogSum / Count), ensuring stability even with large datasets.
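A minimal sketch of that pattern, assuming a table named CAC_AllData and a numeric column [YourNumericColumn] (the names used elsewhere in this thread; substitute your own):

GLOS_Log =
VAR LogSum =
    SUMX ( CAC_AllData, LN ( CAC_AllData[YourNumericColumn] ) )
VAR RowCount =
    COUNTROWS ( CAC_AllData )
RETURN
    EXP ( DIVIDE ( LogSum, RowCount ) )    // equals the nth root of the product without ever forming the product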
If you find this response helpful, please consider marking it as the accepted solution and giving it a thumbs-up to support others in the community.
Thank you & regards,
Prasanna Kumar
Cannot reproduce. Something else must be at play.
So I made an Excel file with two columns, ID and VALUE, with ID running from 1 to 10,000 and VALUE filled with a random number using =RANDBETWEEN(1,1000)/100.
Used that as the data source, making sure that VALUE was typed as a decimal.
Measure:
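Presumably this was the nth-root formula from the original question pointed at the imported table; a hypothetical reconstruction, assuming the table is named Data:

GLOS =
VAR Number = PRODUCT ( Data[VALUE] )
VAR ThisRoot = COUNTROWS ( Data )
RETURN
    Number ^ ( 1 / ThisRoot )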
Hello @AnnieV,
Can you please try the following:
GLOS =
EXP (
    AVERAGEX (
        FILTER (
            CAC_AllData,
            NOT ( ISBLANK ( CAC_AllData[YourNumericColumn] ) )
                && CAC_AllData[YourNumericColumn] > 0
        ),
        LN ( CAC_AllData[YourNumericColumn] )
    )
)
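For reference, this is the same calculation as the nth root of the product: ( x1 * x2 * ... * xn ) ^ ( 1 / n ) = EXP ( ( LN ( x1 ) + ... + LN ( xn ) ) / n ), so averaging LN over the rows and applying EXP gives the geometric mean without ever forming the huge intermediate product.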
I'm not sure I understand what your formula is doing. I need to calculate the product of all values in the column, then take the nth root of that product, where n is the number of rows in the column. They are all greater than 0 and not blank.
RANDBETWEEN(1,1000)/100
That can result in numbers greater than 1, in the worst case for all 10,000 IDs. Multiplying them can then easily exceed the Int64 limit in Power BI. Change your calculation so that they are all below 1 (or use logarithms).
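To put a rough number on it (an illustrative calculation, not figures from this thread): the Int64 maximum is about 9.22 * 10^18, so a running product of values with geometric mean g crosses that bound once n > LN ( 9.22 * 10^18 ) / LN ( g ); for g = 1.04 that is roughly 1,100 rows.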
Many, if not most, of the values will be over 1, as this is a calculation using a measure of time in days (DATEDIFF on hours, divided by 24 to get a decimal day value), with typical values between 1 and 3 but sometimes with much longer outliers.
Is Int64 the largest number it can deal with? I can't pre-calculate this in a practical way in a SQL view, as the aggregation is dynamic (an individual bucket, a group of buckets, all my buckets).
I've thought of running an R script and pulling the result back, storing it as text if needed, but I'm not the Power BI admin, and that will be a slog, so I need at least a temporary workaround.
Yes, Int64 is the limit. Embarrassingly Power BI does not support Int128.
Have you tried the logarithm approach?