Hi all, hoping someone can help please!
Can you suggest how to calculate the average time (mostly in seconds) between customer activity events per Store? I also need to be able to analyse it by Area if required.
I would like to do this entirely in measures, as opposed to calculated columns, due to data volume.
(Customer Id is probably not required)
Many thanks!
Please provide sample data (with sensitive information removed) that covers your issue or question completely, in a usable format (not as a screenshot). Leave out anything not related to the issue.
https://community.fabric.microsoft.com/t5/Community-Blog/How-to-provide-sample-data-in-the-Power-BI-...
Please show the expected outcome based on the sample data you provided.
https://community.fabric.microsoft.com/t5/Desktop/How-to-Get-Your-Question-Answered-Quickly/m-p/1447...
Hi, thanks for the tips. The data was already a sample, so nothing sensitive, but I've removed CustomerID as it's probably not needed anyway.
Many thanks!
Sample data
Area Code | Store_ID | Activity TimesStamp |
Area 1 | Store 1 | 01/07/2022 03:05:04 |
Area 1 | Store 1 | 01/07/2022 03:05:08 |
Area 1 | Store 1 | 01/07/2022 03:05:13 |
Area 1 | Store 1 | 01/07/2022 03:07:14 |
Area 1 | Store 1 | 01/07/2022 03:09:47 |
Area 1 | Store 1 | 01/07/2022 03:14:50 |
Area 1 | Store 1 | 01/07/2022 03:15:13 |
Area 1 | Store 1 | 01/07/2022 03:16:06 |
Area 1 | Store 2 | 01/07/2022 03:18:18 |
Area 1 | Store 2 | 01/07/2022 03:18:50 |
Area 1 | Store 2 | 01/07/2022 03:20:03 |
Area 1 | Store 2 | 01/07/2022 03:20:57 |
Area 1 | Store 2 | 01/07/2022 03:27:39 |
Area 1 | Store 2 | 01/07/2022 03:28:11 |
Area 1 | Store 2 | 01/07/2022 03:28:27 |
Area 2 | Store 3 | 01/07/2022 03:29:21 |
Area 2 | Store 3 | 01/07/2022 03:33:24 |
Area 2 | Store 3 | 01/07/2022 04:12:31 |
Area 2 | Store 3 | 01/07/2022 04:18:06 |
Area 2 | Store 3 | 01/07/2022 04:20:41 |
Area 2 | Store 3 | 01/07/2022 04:21:44 |
Area 2 | Store 3 | 01/07/2022 04:24:05 |
Area 2 | Store 3 | 01/07/2022 04:25:59 |
Area 2 | Store 4 | 01/07/2022 04:28:06 |
Area 2 | Store 4 | 01/07/2022 04:29:08 |
Area 2 | Store 4 | 01/07/2022 04:31:14 |
Area 2 | Store 4 | 01/07/2022 04:33:10 |
Area 2 | Store 4 | 01/07/2022 04:33:26 |
Area 2 | Store 4 | 01/07/2022 04:34:11 |
Area 2 | Store 4 | 01/07/2022 04:36:34 |
Expected output
Average time between events (seconds) | |
Store 1 | 95 |
Store 2 | 101 |
Store 3 | 485 |
Store 4 | 85 |
Average time between events (seconds) | |
Area 1 | 100 |
Area 2 | 288 |
I think you may want to review your expected result by area.
Avg time between events =
// For each event, find the previous event in the same Area and Store
VAR a =
    SUMMARIZE (
        'Table',
        'Table'[Area Code],
        'Table'[Store_ID],
        'Table'[Activity TimesStamp],
        "Prev",
            VAR ac = 'Table'[Area Code]
            VAR s = 'Table'[Store_ID]
            VAR t = 'Table'[Activity TimesStamp]
            RETURN
                CALCULATE (
                    MAX ( 'Table'[Activity TimesStamp] ),
                    'Table'[Area Code] = ac,
                    'Table'[Store_ID] = s,
                    'Table'[Activity TimesStamp] < t
                )
    )
// Gap to the previous event, converted from days to seconds
VAR b =
    ADDCOLUMNS (
        a,
        "diff",
        IF ( ISBLANK ( [Prev] ), 0, ( [Activity TimesStamp] - [Prev] ) * 86400 )
    )
// Drop the first event of each store (no previous event)
VAR c =
    FILTER ( b, [diff] > 0 )
RETURN
    AVERAGEX ( c, [diff] )
Hi, many thanks for this!
I've tried this on my dataset of 1M rows and it's maxing out resources. Any suggestions please?
Thanks!
You might have wanted to mention that earlier. In that case I would suggest you create an index column and use that instead of the timestamp filter.
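As a rough sketch of that suggestion: assuming you add an `Index` column in Power Query (sorted by Store_ID, then by Activity TimesStamp, so the previous event for a store is simply `Index - 1`), the inequality filter on the timestamp can be replaced with a cheap equality lookup. The column names below match the sample data, but the `Index` column itself is an assumption you would need to create first.

```
Avg time between events (indexed) =
// Sketch only - assumes a hypothetical 'Table'[Index] column created in
// Power Query, ordered per store by Activity TimesStamp.
VAR a =
    SUMMARIZE (
        'Table',
        'Table'[Area Code],
        'Table'[Store_ID],
        'Table'[Index],
        'Table'[Activity TimesStamp],
        "Prev",
            VAR s = 'Table'[Store_ID]
            VAR i = 'Table'[Index]
            RETURN
                // Equality filter on the index instead of "< timestamp"
                CALCULATE (
                    MAX ( 'Table'[Activity TimesStamp] ),
                    'Table'[Store_ID] = s,
                    'Table'[Index] = i - 1
                )
    )
VAR b =
    ADDCOLUMNS (
        a,
        "diff",
        IF ( ISBLANK ( [Prev] ), 0, ( [Activity TimesStamp] - [Prev] ) * 86400 )
    )
RETURN
    AVERAGEX ( FILTER ( b, [diff] > 0 ), [diff] )
```

The equality filter lets the engine retrieve exactly one prior row per event instead of scanning all earlier timestamps for every row, which is where the original measure's cost grows on large tables.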