March 31 - April 2, 2025, in Las Vegas, Nevada. Use code MSCUST for a $150 discount! Early bird discount ends December 31.
I would like to understand whether, and how, I could perform a "bitwise OR aggregation" on a column, i.e. calculate the bitwise OR of all values in a specific column, for use in a DAX measure.
Simplified dataset to explain
The (simulated and drastically simplified) source data I have is daily data consisting of the following dimensional attributes
and many numerical metrics, represented in this example by value_1.
In PowerBI I have many measures defined that sum etc. over the value_xx columns, which I can group / display across many of the dimensions.
The problem
Due to load performance I want to change the PowerBI dashboard to monthly aggregates instead of daily data, since in 99% of the cases the data is looked at on a monthly basis only. I can do a lot of the aggregation to a monthly basis in the SQL load scripts, because the majority of my measures are sums anyhow.
However, one of the key measures I have is the distinct count of days per month, which calculates nicely in the PowerBI report across all the different dimensions, filters, etc., because I have the individual days as my finest granularity. When I move up to monthly granularity, I lose the ability to run these distinct-days-per-month counts.
Preparing monthly aggregates in my source system
To overcome this challenge I thought of generating a bit string of length 31 (representing days 1-31 of the month) as part of my data preparation (simple pseudo formula: string(10^(day_of_month - 1))).
So:
an example can be found in the 'day_of_month_bit_position' column in the sample data set below.
since this is a string, I can also calculate the decimal representation of it based on bin2dec conversions in my source system.
that's what is found in day_of_month_bit_as_decimal.
In my source I can also generate the bitwise or aggregates when grouping into my month aggregates, which practically means that for each position in the string the maximum value is taken across all observations I aggregate over. So the monthly aggregate will have 1s on the bits representing the days we had data for.
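The data-prep steps above can be sketched in Python (a minimal sketch; the function names are mine, not from the source system): one bit per day, a bin2dec-style decimal twin, and a group-wise bitwise OR that takes the per-position maximum over all observations.

```python
from functools import reduce

BITS = 31  # one bit per possible day of the month

def day_bits(day_of_month: int) -> int:
    # decimal with a single 1 at bit position (day_of_month - 1)
    return 1 << (day_of_month - 1)

def bit_string(day_of_month: int) -> str:
    # 31-character bit string; day 1 is the rightmost character
    return format(day_bits(day_of_month), f"0{BITS}b")

def monthly_or(days) -> int:
    # bitwise-OR aggregate: per position, the maximum across observations,
    # so the monthly result has 1s exactly on the days that had data
    return reduce(lambda acc, d: acc | day_bits(d), days, 0)
```

For day 24 this yields the decimal 8388608 and the 31-character string shown in the sample data below; duplicate days collapse under the OR, so the popcount of the aggregate is the distinct day count.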
Example (with a week bit string representing Mon-Sun, because 31 is a lot of ones and zeroes for an example):
the bitwise OR of this results in 1010111 (5 unique days: Mon, Wed, Fri, Sat, Sun)
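This week example can be reproduced with a few lines of Python; the daily rows below are hypothetical inputs I picked so that they OR to the quoted result (Monday = leftmost bit).

```python
# Hypothetical daily observations for one week, Monday = leftmost bit.
daily = ["1000000",  # Monday
         "0010000",  # Wednesday
         "0000100",  # Friday
         "0000010",  # Saturday
         "0000001",  # Sunday
         "1000000"]  # Monday again: duplicates collapse under OR

acc = 0
for s in daily:
    acc |= int(s, 2)  # bitwise OR = position-wise maximum

week_bits = format(acc, "07b")
unique_days = week_bits.count("1")
print(week_bits, unique_days)  # 1010111 5
```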
How to do this same aggregation in PowerBI using a measure?
I want to be able to do this same 'bitwise OR aggregation' in PowerBI to count the unique days in the current context. To do so I thought of the following steps
Struggling to get it working
I have been banging my head against the wall for some hours now trying to get anywhere with this, constantly ending up in dead ends 😞.
To summarize (in the example below):
Note: I know this is still daily data; I provided this example on purpose instead of the monthly_pre_aggregates I can create from my source, because validation of the calculation is still possible, since day_of_month is present.
@GeoffreyS Can you post that data as text? There are a number of DAX bit operations that you can find here, including BITOR:
A-B Excel to DAX Translation - Microsoft Fabric Community
A copy-paste from my other message below allows reading it in as TSV.
For completeness, here is the .csv format as well.
With regards to the bitwise functions available: they are meant for comparing two values bitwise, not so much for applying them in an aggregation (I think 😉).
If there is a way to do something like this pseudo code example in DAX it could be an option:
running_bitwise_or_agg = 0
for each value val_x in column x, over all rows of the table in context:
    running_bitwise_or_agg = BITOR(val_x, running_bitwise_or_agg)
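That pseudo code is exactly a fold with bitwise OR. As a stand-in (since DAX has no loop construct), here is what it would look like in Python over the decimal column:

```python
from functools import reduce
from operator import or_

# Decimal day-bit values as they would appear in day_of_month_bit_as_decimal.
values = [8388608, 67108864, 131072]  # days 24, 27 and 18

# Fold bitwise OR over the column, starting from 0 -- the running
# aggregate from the pseudo code above.
running_bitwise_or_agg = reduce(or_, values, 0)

# popcount of the aggregate = number of unique days
print(bin(running_bitwise_or_agg).count("1"))  # 3
```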
csv
day_of_month,month,customer,category,product_group,value_1,day_of_month_bit_position,day_of_month_bit_as_decimal,day_of_month_bit_as_csv
24,2,customer_1,cat a,prod_group_1,62,0000000100000000000000000000000,8388608,"0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0"
27,2,customer_1,cat a,prod_group_2,67,0000100000000000000000000000000,67108864,"0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0"
18,2,customer_1,cat b,prod_group_3,20,0000000000000100000000000000000,131072,"0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0"
29,3,customer_1,cat c,prod_group_4,57,0010000000000000000000000000000,268435456,"0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0"
21,3,customer_1,cat a,prod_group_5,59,0000000000100000000000000000000,1048576,"0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0"
5,3,customer_1,cat b,prod_group_6,78,0000000000000000000000000010000,16,"0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0"
29,1,customer_2,cat b,prod_group_7,20,0010000000000000000000000000000,268435456,"0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0"
17,4,customer_2,cat c,prod_group_8,53,0000000000000010000000000000000,65536,"0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0"
10,5,customer_2,cat b,prod_group_6,46,0000000000000000000001000000000,512,"0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0"
10,1,customer_3,cat a,prod_group_5,81,0000000000000000000001000000000,512,"0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0"
17,1,customer_3,cat b,prod_group_3,22,0000000000000010000000000000000,65536,"0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0"
1,1,customer_3,cat c,prod_group_4,63,0000000000000000000000000000001,1,"0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1"
27,3,customer_3,cat c,prod_group_13,77,0000100000000000000000000000000,67108864,"0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0"
30,3,customer_3,cat a,prod_group_5,41,0100000000000000000000000000000,536870912,"0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0"
@GeoffreysS Hey, how about that, we invented something new: the XBITOR! See the attached PBIX file below my signature.
XBITOR =
VAR __RowCount = COUNTROWS('bitor')
VAR __Length = LEN(MAX('bitor'[day_of_month_bit_position]))
VAR __String = CONCATENATEX('bitor', [day_of_month_bit_position])
VAR __Table =
    ADDCOLUMNS(
        GENERATESERIES(0, __RowCount * (__Length - 1)),
        "__Bit", MID(__String, [Value] + 1, 1) + 0,
        "__Mod", MOD([Value], __Length + 1)
    )
VAR __Table1 =
    ADDCOLUMNS(
        GENERATESERIES(0, __Length),
        "__BITOR",
            VAR __Index = [Value]
            VAR __Sum = SUMX(FILTER(__Table, [__Mod] = __Index), [__Bit])
            VAR __Result = IF(__Sum = 0, 0, 1)
            RETURN
                __Result
    )
RETURN
    CONCATENATEX(__Table1, [__BITOR], , [Value], ASC)
Thanks a lot... that was fast 😉
I am gonna check tomorrow... but it looks promising 😉
I might be asking too much here, but is there a fast way to get the sum of all the ones back as well? (That's the actual unique day count.)
@GeoffreysS Found a bug, fixed in these versions:
XBITOR =
VAR __RowCount = COUNTROWS('bitor')
VAR __Length = LEN(MAX('bitor'[day_of_month_bit_position]))
VAR __String = CONCATENATEX('bitor', [day_of_month_bit_position])
VAR __Table =
    ADDCOLUMNS(
        GENERATESERIES(0, __RowCount * __Length - 1),
        "__Bit", MID(__String, [Value] + 1, 1) + 0,
        "__Mod", MOD([Value], __Length)
    )
VAR __Table1 =
    ADDCOLUMNS(
        GENERATESERIES(0, __Length - 1),
        "__BITOR",
            VAR __Index = [Value]
            VAR __Sum = SUMX(FILTER(__Table, [__Mod] = __Index), [__Bit])
            VAR __Result = IF(__Sum = 0, 0, 1)
            RETURN
                __Result
    )
RETURN
    CONCATENATEX(__Table1, [__BITOR], , [Value], ASC)
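To sanity-check the index arithmetic of the fixed measure outside DAX, here is a sketch (my Python translation, not part of the PBIX) of the same approach: concatenate all row bit strings, then for each position 0..length-1 sum the characters whose global index is congruent to that position modulo the string length; the OR bit for a position is 1 exactly when that sum is nonzero.

```python
def xbitor(bit_strings):
    """Mirror of the fixed XBITOR measure: per-position OR computed by
    a modulo walk over the concatenation of all row bit strings."""
    length = len(bit_strings[0])       # LEN(MAX(...)) in the measure
    concat = "".join(bit_strings)      # CONCATENATEX in the measure
    result = []
    for index in range(length):        # GENERATESERIES(0, __Length - 1)
        # sum the characters whose global index mod length equals this index
        total = sum(int(concat[v]) for v in range(len(concat))
                    if v % length == index)
        result.append("1" if total else "0")
    return "".join(result)

print(xbitor(["1000000", "0010000", "0000100"]))  # 1010100
```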
XBITOR_DEC =
VAR __RowCount = COUNTROWS('bitor')
VAR __Length = LEN(MAX('bitor'[day_of_month_bit_position]))
VAR __String = CONCATENATEX('bitor', [day_of_month_bit_position])
VAR __Table =
    ADDCOLUMNS(
        GENERATESERIES(0, __RowCount * __Length - 1),
        "__Bit", MID(__String, [Value] + 1, 1) + 0,
        "__Mod", MOD([Value], __Length)
    )
VAR __Table1 =
    ADDCOLUMNS(
        GENERATESERIES(0, __Length - 1),
        "__BITOR",
            VAR __Index = [Value]
            VAR __Sum = SUMX(FILTER(__Table, [__Mod] = __Index), [__Bit])
            VAR __Result = IF(__Sum = 0, 0, 1)
            RETURN
                __Result
    )
RETURN
    COUNTROWS(FILTER(__Table1, [__BITOR] = 1))
@GeoffreysS So, if I understand correctly, you want this?
XBITOR_DEC =
VAR __RowCount = COUNTROWS('bitor')
VAR __Length = LEN(MAX('bitor'[day_of_month_bit_position]))
VAR __String = CONCATENATEX('bitor', [day_of_month_bit_position])
VAR __Table =
    ADDCOLUMNS(
        GENERATESERIES(0, __RowCount * __Length - 1),
        "__Bit", MID(__String, [Value] + 1, 1) + 0,
        "__Mod", MOD([Value], __Length)
    )
VAR __Table1 =
    ADDCOLUMNS(
        GENERATESERIES(0, __Length - 1),
        "__BITOR",
            VAR __Index = [Value]
            VAR __Sum = SUMX(FILTER(__Table, [__Mod] = __Index), [__Bit])
            VAR __Result = IF(__Sum = 0, 0, 1)
            RETURN
                __Result
    )
RETURN
    COUNTROWS(FILTER(__Table1, [__BITOR] = 1))
thanks... still have not found the time to try it out... but will in the upcoming days and let you know.
thanks a million again.
Sample dataset here:
day_of_month | month | customer | category | product_group | value_1 | day_of_month_bit_position | day_of_month_bit_as_decimal | day_of_month_bit_as_csv
---|---|---|---|---|---|---|---|---
24 | 2 | customer_1 | cat a | prod_group_1 | 62 | 0000000100000000000000000000000 | 8388608 | 0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
27 | 2 | customer_1 | cat a | prod_group_2 | 67 | 0000100000000000000000000000000 | 67108864 | 0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
18 | 2 | customer_1 | cat b | prod_group_3 | 20 | 0000000000000100000000000000000 | 131072 | 0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
29 | 3 | customer_1 | cat c | prod_group_4 | 57 | 0010000000000000000000000000000 | 268435456 | 0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
21 | 3 | customer_1 | cat a | prod_group_5 | 59 | 0000000000100000000000000000000 | 1048576 | 0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
5 | 3 | customer_1 | cat b | prod_group_6 | 78 | 0000000000000000000000000010000 | 16 | 0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0
29 | 1 | customer_2 | cat b | prod_group_7 | 20 | 0010000000000000000000000000000 | 268435456 | 0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
17 | 4 | customer_2 | cat c | prod_group_8 | 53 | 0000000000000010000000000000000 | 65536 | 0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
10 | 5 | customer_2 | cat b | prod_group_6 | 46 | 0000000000000000000001000000000 | 512 | 0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0
10 | 1 | customer_3 | cat a | prod_group_5 | 81 | 0000000000000000000001000000000 | 512 | 0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0
17 | 1 | customer_3 | cat b | prod_group_3 | 22 | 0000000000000010000000000000000 | 65536 | 0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
1 | 1 | customer_3 | cat c | prod_group_4 | 63 | 0000000000000000000000000000001 | 1 | 0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1
27 | 3 | customer_3 | cat c | prod_group_13 | 77 | 0000100000000000000000000000000 | 67108864 | 0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
30 | 3 | customer_3 | cat a | prod_group_5 | 41 | 0100000000000000000000000000000 | 536870912 | 0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
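Validation against this sample is straightforward precisely because day_of_month is still present: OR the day_of_month_bit_as_decimal values of a group and count the set bits; the result must equal the distinct day count. A minimal sketch over the customer_1 rows above:

```python
from functools import reduce
from operator import or_

# (day_of_month, day_of_month_bit_as_decimal) for the customer_1 rows above
rows = [(24, 8388608), (27, 67108864), (18, 131072),
        (29, 268435456), (21, 1048576), (5, 16)]

# every decimal is 2**(day - 1), i.e. one bit per day
assert all(dec == 1 << (day - 1) for day, dec in rows)

agg = reduce(or_, (dec for _, dec in rows), 0)  # bitwise-OR aggregate
unique_days = bin(agg).count("1")               # popcount = distinct days

print(unique_days)  # 6, matching len({day for day, _ in rows})
```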