Hello All,
I have a requirement to find, in a dataset, all the 10-minute time intervals that contain more than 2 records.
My dataset looks like this:
EventTime
01/01/2020 00:01:00
01/01/2020 00:02:00
01/01/2020 00:03:00
01/01/2020 00:12:00
01/01/2020 00:13:00
01/01/2020 00:14:00
01/01/2020 00:15:00
01/01/2020 00:23:00
01/01/2020 00:24:00
01/01/2020 00:48:00
01/01/2020 00:49:00
01/01/2020 00:58:00
I want output like this:
EventTime Count
01/01/2020 00:01:00 3
01/01/2020 00:02:00 2
01/01/2020 00:03:00 2
01/01/2020 00:12:00 4
01/01/2020 00:13:00 3
01/01/2020 00:14:00 3
01/01/2020 00:15:00 3
01/01/2020 00:23:00 2
01/01/2020 00:24:00 1
01/01/2020 00:48:00 2
01/01/2020 00:49:00 2
01/01/2020 00:58:00 1
My dataset is very large, and the measures I have tried for identifying the nth record are not working.
Any help on this issue is greatly appreciated.
Thank you.
GR
Use this as a calculated column:
CalculatedColumn =
// EventTime of the current row
VAR _curEventTime = Table2[EventTime]
// Rows whose EventTime falls in [current, current + 10 minutes); 10/(24*60) is ten minutes as a fraction of a day
VAR _tmpTable =
    FILTER ( Table2, Table2[EventTime] >= _curEventTime && Table2[EventTime] < _curEventTime + 10 / ( 24 * 60 ) )
RETURN
    COUNTROWS ( _tmpTable )
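For readers who want to sanity-check the expected counts outside Power BI, here is a minimal Python/pandas sketch of the same rolling 10-minute count. The sample timestamps are taken from the post; the DataFrame and column names are only illustrative and not part of the original model.

import numpy as np
import pandas as pd

# Sample EventTime values from the post (written in ISO order to avoid day/month ambiguity).
events = pd.DataFrame({"EventTime": pd.to_datetime([
    "2020-01-01 00:01", "2020-01-01 00:02", "2020-01-01 00:03",
    "2020-01-01 00:12", "2020-01-01 00:13", "2020-01-01 00:14",
    "2020-01-01 00:15", "2020-01-01 00:23", "2020-01-01 00:24",
    "2020-01-01 00:48", "2020-01-01 00:49", "2020-01-01 00:58",
])}).sort_values("EventTime").reset_index(drop=True)

# For each row, count the events in the forward window [t, t + 10 minutes).
# On the sorted timestamps, two binary searches per row give the window boundaries.
times = events["EventTime"].to_numpy()
window_end = times + np.timedelta64(10, "m")
events["Count"] = times.searchsorted(window_end, side="left") - times.searchsorted(times, side="left")

print(events)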
If you have any questions, let me know!
Kind regards
Djerro123
-------------------------------
If this answered your question, please mark it as the Solution. This also helps others to find what they are looking for.
Keep those thumbs up coming! 🙂
Proud to be a Super User!
Thank you djerro123! I tried your solution, but because my dataset is very large, Power BI Desktop has been processing for more than 12 hours now and is still not done. Do you know of any alternatives?
Hi @Anonymous ,
When you have a massive dataset, you might want to explore other ways to enrich your data. For example, you could store your data in Azure Synapse Analytics and use an Azure Databricks workspace to analyze it on a Spark cluster. In your use case, where the dataset consists of millions of rows (billions?), that approach is highly recommended and will be faster.
How big is the dataset anyway (in terms of rows and GB)?
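If you do go the Spark route, the same rolling count maps to a range window over the event timestamp. Below is a minimal PySpark sketch; the table name "events" and the SparkSession setup are placeholders, not something from this thread, and it assumes EventTime has second-level granularity.

from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("rolling-10min-count").getOrCreate()

# Placeholder source: assume the events are already available as a table with an EventTime timestamp column.
events_df = spark.read.table("events")

# A range frame needs a numeric ordering column, so order by the epoch seconds of EventTime
# and count every row from the current timestamp up to, but not including, +10 minutes (0..599 seconds).
# Note: without partitionBy, Spark moves all rows to a single partition for this window;
# for very large data, partition by a coarser key (e.g. day) if windows never cross that boundary.
w = Window.orderBy(F.unix_timestamp("EventTime")).rangeBetween(0, 599)

result = events_df.withColumn("Count", F.count("*").over(w))
result.show()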
Proud to be a Super User!
Hello @JarroVGIT
My dataset consists of 3 million records. I will check out the options you mentioned.
Thanks for your consistent support to the Power BI community!