Freeseman
Helper II

Total Divided by Unique ID count

Hi All,

 

Do I need to write this in SQL, or is it possible as a measure in Power BI?

 

I need to take the total payload result, which is the last record of weight type B (98.5), and divide it by the count of the buckets above it (12) to get my average weight of 8.20. Weight type T is basically the end result of the final B value, so we need to ignore T and just divide by the group of B rows for the unique haul cycle rec ident on the right.

[Screenshot: Freeseman_2-1662528738291.png]

 

You will see the REC ident is 39587 for this cycle. I would need to take the last value of weight B (98.5) and divide it by the count of haul cycle rec ident rows, but unique to this cycle, if you know what I mean.

How can I achieve this?
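For reference, the arithmetic being asked for, using the numbers quoted above (note the quotient is ≈ 8.21, which the post rounds to 8.20):

```python
# Numbers taken from the post: the last "B" payload (98.5)
# divided by the 12 bucket rows above it.
last_b_payload = 98.5
bucket_count = 12

average_weight = last_b_payload / bucket_count
print(round(average_weight, 2))  # 8.21
```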

 

1 ACCEPTED SOLUTION
v-yangliu-msft
Community Support

Hi @Freeseman,

Here are the steps you can follow:

1. Create a measure.

Measure =
VAR _maxdate =
    MAXX (
        FILTER (
            ALL ( 'Table' ),
            'Table'[HAUL_CYCLE_REC_IDENT] = MAX ( 'Table'[HAUL_CYCLE_REC_IDENT] )
                && 'Table'[Group] <> "T"
        ),
        [TIMESTAMP]
    )
VAR _sum =
    SUMX (
        FILTER (
            ALL ( 'Table' ),
            'Table'[HAUL_CYCLE_REC_IDENT] = MAX ( 'Table'[HAUL_CYCLE_REC_IDENT] )
                && 'Table'[TIMESTAMP] = _maxdate
        ),
        [PAYLOAD]
    )
VAR _count =
    COUNTX (
        FILTER (
            ALL ( 'Table' ),
            'Table'[HAUL_CYCLE_REC_IDENT] = MAX ( 'Table'[HAUL_CYCLE_REC_IDENT] )
                && 'Table'[TIMESTAMP] < _maxdate
        ),
        [Group]
    )
RETURN
    DIVIDE ( _sum, _count )
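To make the measure's three steps concrete, here is a rough pandas equivalent. The column names are taken from the measure; the toy data (one cycle with four "B" buckets and a trailing "T" row) and payload values are illustrative, not from the original post.

```python
import pandas as pd

# Toy data for a single haul cycle; values are illustrative
df = pd.DataFrame({
    "HAUL_CYCLE_REC_IDENT": [39587] * 5,
    "TIMESTAMP": pd.to_datetime(
        ["2022-09-07 08:00", "2022-09-07 08:05", "2022-09-07 08:10",
         "2022-09-07 08:15", "2022-09-07 08:20"]),
    "Group": ["B", "B", "B", "B", "T"],
    "PAYLOAD": [20.0, 45.0, 70.0, 98.5, 98.5],
})

cycle = df[df["HAUL_CYCLE_REC_IDENT"] == 39587]

# _maxdate: latest timestamp among non-"T" rows
max_date = cycle.loc[cycle["Group"] != "T", "TIMESTAMP"].max()

# _sum: the payload at that timestamp (the last "B" record)
last_payload = cycle.loc[cycle["TIMESTAMP"] == max_date, "PAYLOAD"].sum()

# _count: rows strictly before that timestamp (the buckets above it)
bucket_count = int((cycle["TIMESTAMP"] < max_date).sum())

average = last_payload / bucket_count  # = 98.5 / 3 with this toy data
```

Note the divisor is the count of rows *before* the last "B" timestamp, which matches the question's "buckets above it".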

2. Result:

[Screenshot: vyangliumsft_0-1662625664643.png]

 

Best Regards,

Liu Yang

If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.


2 REPLIES

davehus
Memorable Member

Hi @Freeseman ,

Try the code below to see if it works for you.

Max Value =
VAR vMaxTime =
    CALCULATE ( MAX ( 'Table'[DateTime] ), FILTER ( 'Table', 'Table'[B/T] <> "T" ) ) //What is the max time not related to T
VAR vRecordCount =
    CALCULATE (
        COUNTROWS ( 'Table' ),
        ALLEXCEPT ( 'Table', 'Table'[Ident] ),
        FILTER ( 'Table', 'Table'[B/T] <> "T" )
    ) //What is the record count
VAR vMaxValue =
    CALCULATE (
        MAX ( 'Table'[Payload] ),
        ALLEXCEPT ( 'Table', 'Table'[Ident] ),
        FILTER ( 'Table', 'Table'[DateTime] = vMaxTime && 'Table'[B/T] <> "T" )
    ) //What is the maxvalue
RETURN
    DIVIDE ( vMaxValue, vRecordCount )
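As a sanity check on this variant's logic, a rough pandas sketch (the two-cycle sample data and its values are hypothetical). `ALLEXCEPT ( 'Table', 'Table'[Ident] )` keeps only the Ident filter, which corresponds to grouping by Ident here. Note one difference from the accepted measure: this divisor counts *all* non-"T" rows, including the final "B" record.

```python
import pandas as pd

# Hypothetical two-cycle sample; columns follow the measure above
df = pd.DataFrame({
    "Ident": [39587, 39587, 39587, 39587, 39588, 39588, 39588],
    "DateTime": pd.to_datetime([
        "2022-09-07 08:00", "2022-09-07 08:05", "2022-09-07 08:10",
        "2022-09-07 08:12", "2022-09-07 09:00", "2022-09-07 09:05",
        "2022-09-07 09:07"]),
    "B/T": ["B", "B", "B", "T", "B", "B", "T"],
    "Payload": [40.0, 70.0, 98.5, 98.5, 30.0, 55.0, 55.0],
})

def cycle_average(g):
    """Last non-'T' payload divided by the non-'T' row count."""
    b = g[g["B/T"] != "T"]                            # drop the "T" rows
    last = b.loc[b["DateTime"].idxmax(), "Payload"]   # vMaxValue
    return last / len(b)                              # DIVIDE(vMaxValue, vRecordCount)

# Grouping by Ident plays the role of ALLEXCEPT('Table', 'Table'[Ident])
result = {ident: cycle_average(g) for ident, g in df.groupby("Ident")}
```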

Did I help you today? Please accept my solution and hit the Kudos button.
