Bmejia
Super User

Identify duplicate values with different values

I am trying to identify whether an employee submitted the same expense amount in two different orders or in the same order. I built the calculated column below to filter my visual down to rows where the same amount appears in the same order. I tried combining both conditions, but it didn't work, and I understand it has to be one or the other, so now I am trying to split it into two filters to keep things simpler. I am having trouble with Filter1, which should look for the same amount across different orders.

-- If greater than 1, then "Same Amount in the Same Order#"

New =
COUNTROWS(
    FILTER(Table1,
        Table1[orders] = EARLIER(Table1[orders])
            && EARLIER(Table1[Amount]) = Table1[Amount]))
-- Using this to identify all duplicates
Duplicates =
IF (
    CALCULATE (
        COUNTROWS ( Table1 ),
        ALLEXCEPT ( Table1, Table1[Employee], Table1[Amount], Table1[Description] )
    ) = 1,
    0,
    1
)
Employee | Description | Orders | TRANSDATE | Amount | My Filter Part1 | My Filter Part2 (for visual) | Filter1 (I would like to see) | Filter2 (2nd filter I would like to see)
John | Miles | Ord-U088518 | 5/24/2022 | 1.12 | 2 | Same Amount in the Same Order# | Same Amount in Different Order# | Same Amount in the Same Order#
John | Miles | Ord-U088518 | 5/25/2022 | 1.12 | 2 | Same Amount in the Same Order# | Same Amount in Different Order# | Same Amount in the Same Order#
John | Miles | Ord-U088660 | 6/15/2022 | 1.12 | 1 | Same Amount in Different Order# | Same Amount in Different Order# |
John | Accident | Ord-U088464 | 7/11/2022 | 5 | 2 | Same Amount in the Same Order# | Same Amount in Different Order# | Same Amount in the Same Order#
John | Accident | Ord-U088464 | 7/18/2022 | 5 | 2 | Same Amount in the Same Order# | Same Amount in Different Order# | Same Amount in the Same Order#
John | Accident | Ord-U088765 | 7/22/2022 | 5 | 1 | Same Amount in Different Order# | Same Amount in Different Order# |
John | Accident | Ord-U088518 | 6/6/2022 | 20 | 1 | Same Amount in Different Order# | |
John | Accident | Ord-U088660 | 6/14/2022 | 20 | 1 | Same Amount in Different Order# | |
John | Lunch | Ord-U088464 | 7/11/2022 | 46 | 2 | Same Amount in the Same Order# | Same Amount in Different Order# | Same Amount in the Same Order#
John | Lunch | Ord-U088464 | 7/18/2022 | 46 | 2 | Same Amount in the Same Order# | Same Amount in Different Order# | Same Amount in the Same Order#
John | Lunch | Ord-U088665 | 7/22/2022 | 46 | 1 | Same Amount in Different Order# | Same Amount in Different Order# |
John | Miles | Ord-U088518 | 6/1/2022 | 48.16 | 2 | Same Amount in the Same Order# | | Same Amount in the Same Order#
John | Miles | Ord-U088518 | 6/3/2022 | 48.16 | 2 | Same Amount in the Same Order# | | Same Amount in the Same Order#
John | Miles | Ord-U088518 | 5/23/2022 | 96.88 | 2 | Same Amount in the Same Order# | | Same Amount in the Same Order#
John | Miles | Ord-U088518 | 5/26/2022 | 96.88 | 2 | Same Amount in the Same Order# | | Same Amount in the Same Order#
John | Miles | Ord-U088464 | 7/11/2022 | 117.6 | 2 | Same Amount in the Same Order# | | Same Amount in the Same Order#
John | Miles | Ord-U088464 | 7/18/2022 | 117.6 | 2 | Same Amount in the Same Order# | | Same Amount in the Same Order#
John | Dinner | Ord-U088518 | 5/23/2022 | 136 | 2 | Same Amount in the Same Order# | | Same Amount in the Same Order#
John | Dinner | Ord-U088518 | 6/6/2022 | 136 | 2 | Same Amount in the Same Order# | | Same Amount in the Same Order#
John | Hotel | Ord-U088518 | 5/24/2022 | 155.82 | 6 | Same Amount in the Same Order# | | Same Amount in the Same Order#
John | Hotel | Ord-U088518 | 5/25/2022 | 155.82 | 3 | Same Amount in the Same Order# | | Same Amount in the Same Order#
1 ACCEPTED SOLUTION
HoangHugo
Solution Specialist

Hi,

If I understand correctly, Filter1 should look for other orders that have the same amount, so it should be:

New =
COUNTROWS(
    FILTER(Table1,
        Table1[orders]<>EARLIER(Table1[orders])
            && EARLIER(Table1[Amount])=Table1[Amount]))
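
If the goal is the text labels shown in the desired Filter1 column rather than a raw count, a small follow-up calculated column along these lines could translate the count into a label (a sketch only; it assumes the counting column above is named [New], and copies the label text from the sample table):

Filter1 =
IF (
    Table1[New] > 0,
    "Same Amount in Different Order#",
    BLANK ()
)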


5 REPLIES
Bmejia
Super User

HoangHugo's solution works, but once I implemented it with production data, which is a lot more data, it took a long time to load and I had to stop it. Is there a more efficient way to get this resolved so it does not take a toll on data load performance?

I summarized the data and filtered it down to only what needs to be scanned, but unfortunately I am still having the same issue. I am down to around 165,000 rows and it still causes the problem. I do have three other columns where I use the EARLIER function with "=" rather than "<>", and the model ran fine once I removed this column.

TABLE2 =
CALCULATETABLE(
    SUMMARIZE( TABLE1, TABLE1[Amount], TABLE1[Order], TABLE1[Description], TABLE1[Employee] ),
    TABLE1[Duplicates] = 1 )

Thanks, any feedback would be appreciated.
HoangHugo
Solution Specialist

Sure, Microsoft advises that EARLIER can take a long time. In your case scanning the entire table is required, but I suggest you use SUMMARIZE to make a new table and reduce the rows the engine has to scan.

Bmejia
Super User

Thanks Hoang, I just added another EARLIER condition to break the data out by employee, since my example only included one employee.

New =
COUNTROWS(
    FILTER(Table1,
        Table1[orders] <> EARLIER(Table1[orders])
            && EARLIER(Table1[Amount]) = Table1[Amount]
            && EARLIER(Table1[Employee]) = Table1[Employee]))
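
Since EARLIER proved slow on the production data, a variable-based rewrite of the same three-condition column may be worth trying; variables are the generally recommended replacement for EARLIER in calculated columns. This is a sketch of the same logic, not a guaranteed speed-up:

New =
VAR CurrentOrder = Table1[orders]
VAR CurrentAmount = Table1[Amount]
VAR CurrentEmployee = Table1[Employee]
RETURN
    COUNTROWS(
        FILTER( Table1,
            Table1[orders] <> CurrentOrder
                && Table1[Amount] = CurrentAmount
                && Table1[Employee] = CurrentEmployee ) )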

