Apologies if this has been posted before, as I do see a lot of consecutive date topics here.
I have groundwater well data where each day ([DateOnly]) has been assigned a 1 or a 0 in [Total Days with 3 Exceed Ref], depending on whether certain parameters are met. While I can get the total number of days that received a 1, I need a few other measurements.
Q1) How many times did the groundwater well exceed the value over time?
Q2) How long did those times last?
Q3) The longest of those times?
In the example below, a well exceeded a reference level on 264 days in total. One of those periods ran from 5/21/2024 through 6/16/2024; on 6/17/2024 the well level no longer met the criteria, so it met the criteria for 27 days. It exceeded again on 6/24/2024 and that lasted until 10/24/2024, so it met the criteria again for 119 days. This repeats through the timeframe as shown in the table. The ideal output would be a table like the one below, with the start date, the end date, the total number of consecutive days, and how often that occurred. Example data is pasted at the bottom should you want to work with it.
START | END | COUNT OF DAYS | INSTANCE |
5/21/2024 | 6/16/2024 | 27 | 1 |
6/24/2024 | 10/24/2024 | 119 | 1 |
11/8/2024 | 12/4/2024 | 27 | 1 |
12/30/2024 | 1/2/2025 | 4 | 1 |
1/25/2025 | 2/9/2025 | 16 | 1 |
2/14/2025 | 3/1/2025 | 16 | 1 |
3/6/2025 | 3/6/2025 | 1 | 1 |
3/10/2025 | 4/14/2025 | 36 | 1 |
5/11/2025 | 5/28/2025 | 18 | 1 |
TOTAL | | 264 | 9 |
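(For reference, those nine instances add up to 27 + 119 + 27 + 4 + 16 + 16 + 1 + 36 + 18 = 264 exceedance days, matching the totals row.)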
RAW DATA:
DateOnly | Total Days With 3 Exceed Ref |
5/20/2024 0:00 | 0 |
5/21/2024 0:00 | 1 |
5/22/2024 0:00 | 1 |
5/23/2024 0:00 | 1 |
5/24/2024 0:00 | 1 |
5/25/2024 0:00 | 1 |
5/26/2024 0:00 | 1 |
5/27/2024 0:00 | 1 |
5/28/2024 0:00 | 1 |
5/29/2024 0:00 | 1 |
5/30/2024 0:00 | 1 |
5/31/2024 0:00 | 1 |
6/1/2024 0:00 | 1 |
6/2/2024 0:00 | 1 |
6/3/2024 0:00 | 1 |
6/4/2024 0:00 | 1 |
6/5/2024 0:00 | 1 |
6/6/2024 0:00 | 1 |
6/7/2024 0:00 | 1 |
6/8/2024 0:00 | 1 |
6/9/2024 0:00 | 1 |
6/10/2024 0:00 | 1 |
6/11/2024 0:00 | 1 |
6/12/2024 0:00 | 1 |
6/13/2024 0:00 | 1 |
6/14/2024 0:00 | 1 |
6/15/2024 0:00 | 1 |
6/16/2024 0:00 | 1 |
6/17/2024 0:00 | 0 |
6/18/2024 0:00 | 0 |
6/19/2024 0:00 | 0 |
6/20/2024 0:00 | 0 |
6/21/2024 0:00 | 0 |
6/22/2024 0:00 | 0 |
6/23/2024 0:00 | 0 |
6/24/2024 0:00 | 1 |
6/25/2024 0:00 | 1 |
6/26/2024 0:00 | 1 |
6/27/2024 0:00 | 1 |
6/28/2024 0:00 | 1 |
6/29/2024 0:00 | 1 |
6/30/2024 0:00 | 1 |
7/1/2024 0:00 | 1 |
7/2/2024 0:00 | 1 |
7/3/2024 0:00 | 1 |
7/4/2024 0:00 | 1 |
7/5/2024 0:00 | 1 |
7/6/2024 0:00 | 1 |
7/7/2024 0:00 | 1 |
7/8/2024 0:00 | 1 |
7/9/2024 0:00 | 1 |
7/10/2024 0:00 | 1 |
7/11/2024 0:00 | 1 |
7/12/2024 0:00 | 1 |
7/13/2024 0:00 | 1 |
7/14/2024 0:00 | 1 |
7/15/2024 0:00 | 1 |
7/16/2024 0:00 | 1 |
7/17/2024 0:00 | 1 |
7/18/2024 0:00 | 1 |
7/19/2024 0:00 | 1 |
7/20/2024 0:00 | 1 |
7/21/2024 0:00 | 1 |
7/22/2024 0:00 | 1 |
7/23/2024 0:00 | 1 |
7/24/2024 0:00 | 1 |
7/25/2024 0:00 | 1 |
7/26/2024 0:00 | 1 |
7/27/2024 0:00 | 1 |
7/28/2024 0:00 | 1 |
7/29/2024 0:00 | 1 |
7/30/2024 0:00 | 1 |
7/31/2024 0:00 | 1 |
8/1/2024 0:00 | 1 |
8/2/2024 0:00 | 1 |
8/3/2024 0:00 | 1 |
8/4/2024 0:00 | 1 |
8/5/2024 0:00 | 1 |
8/6/2024 0:00 | 1 |
8/7/2024 0:00 | 1 |
8/8/2024 0:00 | 1 |
8/9/2024 0:00 | 1 |
8/10/2024 0:00 | 1 |
8/11/2024 0:00 | 1 |
8/12/2024 0:00 | 1 |
8/13/2024 0:00 | 1 |
8/14/2024 0:00 | 1 |
8/15/2024 0:00 | 1 |
8/16/2024 0:00 | 1 |
8/17/2024 0:00 | 1 |
8/18/2024 0:00 | 1 |
8/19/2024 0:00 | 1 |
8/20/2024 0:00 | 1 |
8/21/2024 0:00 | 1 |
8/22/2024 0:00 | 1 |
8/23/2024 0:00 | 1 |
8/24/2024 0:00 | 1 |
8/25/2024 0:00 | 1 |
8/26/2024 0:00 | 1 |
8/27/2024 0:00 | 1 |
8/28/2024 0:00 | 1 |
8/29/2024 0:00 | 1 |
8/30/2024 0:00 | 1 |
8/31/2024 0:00 | 1 |
9/1/2024 0:00 | 1 |
9/2/2024 0:00 | 1 |
9/3/2024 0:00 | 1 |
9/4/2024 0:00 | 1 |
9/5/2024 0:00 | 1 |
9/6/2024 0:00 | 1 |
9/7/2024 0:00 | 1 |
9/8/2024 0:00 | 1 |
9/9/2024 0:00 | 1 |
9/10/2024 0:00 | 1 |
9/11/2024 0:00 | 1 |
9/12/2024 0:00 | 1 |
9/13/2024 0:00 | 1 |
9/14/2024 0:00 | 1 |
9/15/2024 0:00 | 1 |
9/16/2024 0:00 | 1 |
9/17/2024 0:00 | 1 |
9/18/2024 0:00 | 1 |
9/19/2024 0:00 | 1 |
9/20/2024 0:00 | 1 |
9/21/2024 0:00 | 1 |
9/22/2024 0:00 | 1 |
9/23/2024 0:00 | 1 |
9/24/2024 0:00 | 1 |
9/25/2024 0:00 | 1 |
9/26/2024 0:00 | 1 |
9/27/2024 0:00 | 1 |
9/28/2024 0:00 | 1 |
9/29/2024 0:00 | 1 |
9/30/2024 0:00 | 1 |
10/1/2024 0:00 | 1 |
10/2/2024 0:00 | 1 |
10/3/2024 0:00 | 1 |
10/4/2024 0:00 | 1 |
10/5/2024 0:00 | 1 |
10/6/2024 0:00 | 1 |
10/7/2024 0:00 | 1 |
10/8/2024 0:00 | 1 |
10/9/2024 0:00 | 1 |
10/10/2024 0:00 | 1 |
10/11/2024 0:00 | 1 |
10/12/2024 0:00 | 1 |
10/13/2024 0:00 | 1 |
10/14/2024 0:00 | 1 |
10/15/2024 0:00 | 1 |
10/16/2024 0:00 | 1 |
10/17/2024 0:00 | 1 |
10/18/2024 0:00 | 1 |
10/19/2024 0:00 | 1 |
10/20/2024 0:00 | 1 |
10/21/2024 0:00 | 0 |
10/22/2024 0:00 | 0 |
10/23/2024 0:00 | 0 |
10/24/2024 0:00 | 0 |
10/25/2024 0:00 | 0 |
10/26/2024 0:00 | 0 |
10/27/2024 0:00 | 0 |
10/28/2024 0:00 | 0 |
10/29/2024 0:00 | 0 |
10/30/2024 0:00 | 0 |
10/31/2024 0:00 | 0 |
11/1/2024 0:00 | 0 |
11/2/2024 0:00 | 0 |
11/3/2024 0:00 | 0 |
11/4/2024 0:00 | 0 |
11/5/2024 0:00 | 0 |
11/6/2024 0:00 | 0 |
11/7/2024 0:00 | 0 |
11/8/2024 0:00 | 1 |
11/9/2024 0:00 | 1 |
11/10/2024 0:00 | 1 |
11/11/2024 0:00 | 1 |
11/12/2024 0:00 | 1 |
11/13/2024 0:00 | 1 |
11/14/2024 0:00 | 1 |
11/15/2024 0:00 | 1 |
11/16/2024 0:00 | 1 |
11/17/2024 0:00 | 1 |
11/18/2024 0:00 | 1 |
11/19/2024 0:00 | 1 |
11/20/2024 0:00 | 1 |
11/21/2024 0:00 | 1 |
11/22/2024 0:00 | 1 |
11/23/2024 0:00 | 1 |
11/24/2024 0:00 | 1 |
11/25/2024 0:00 | 1 |
11/26/2024 0:00 | 1 |
11/27/2024 0:00 | 1 |
11/28/2024 0:00 | 1 |
11/29/2024 0:00 | 1 |
11/30/2024 0:00 | 1 |
12/1/2024 0:00 | 1 |
12/2/2024 0:00 | 1 |
12/3/2024 0:00 | 1 |
12/4/2024 0:00 | 1 |
12/5/2024 0:00 | 0 |
12/6/2024 0:00 | 0 |
12/7/2024 0:00 | 0 |
12/8/2024 0:00 | 0 |
12/9/2024 0:00 | 0 |
12/10/2024 0:00 | 0 |
12/11/2024 0:00 | 0 |
12/12/2024 0:00 | 0 |
12/13/2024 0:00 | 0 |
12/14/2024 0:00 | 0 |
12/15/2024 0:00 | 0 |
12/16/2024 0:00 | 0 |
12/17/2024 0:00 | 0 |
12/18/2024 0:00 | 0 |
12/19/2024 0:00 | 0 |
12/20/2024 0:00 | 0 |
12/21/2024 0:00 | 0 |
12/22/2024 0:00 | 0 |
12/23/2024 0:00 | 0 |
12/24/2024 0:00 | 0 |
12/25/2024 0:00 | 0 |
12/26/2024 0:00 | 0 |
12/27/2024 0:00 | 0 |
12/28/2024 0:00 | 0 |
12/29/2024 0:00 | 0 |
12/30/2024 0:00 | 1 |
12/31/2024 0:00 | 1 |
1/1/2025 0:00 | 1 |
1/2/2025 0:00 | 1 |
1/3/2025 0:00 | 0 |
1/4/2025 0:00 | 0 |
1/5/2025 0:00 | 0 |
1/6/2025 0:00 | 0 |
1/7/2025 0:00 | 0 |
1/8/2025 0:00 | 0 |
1/9/2025 0:00 | 0 |
1/10/2025 0:00 | 0 |
1/11/2025 0:00 | 0 |
1/12/2025 0:00 | 0 |
1/13/2025 0:00 | 0 |
1/14/2025 0:00 | 0 |
1/15/2025 0:00 | 0 |
1/16/2025 0:00 | 0 |
1/17/2025 0:00 | 0 |
1/18/2025 0:00 | 0 |
1/19/2025 0:00 | 0 |
1/20/2025 0:00 | 0 |
1/21/2025 0:00 | 0 |
1/22/2025 0:00 | 0 |
1/23/2025 0:00 | 0 |
1/24/2025 0:00 | 0 |
1/25/2025 0:00 | 1 |
1/26/2025 0:00 | 1 |
1/27/2025 0:00 | 1 |
1/28/2025 0:00 | 1 |
1/29/2025 0:00 | 1 |
1/30/2025 0:00 | 1 |
1/31/2025 0:00 | 1 |
2/1/2025 0:00 | 1 |
2/2/2025 0:00 | 1 |
2/3/2025 0:00 | 1 |
2/4/2025 0:00 | 1 |
2/5/2025 0:00 | 1 |
2/6/2025 0:00 | 1 |
2/7/2025 0:00 | 1 |
2/8/2025 0:00 | 1 |
2/9/2025 0:00 | 1 |
2/10/2025 0:00 | 0 |
2/11/2025 0:00 | 0 |
2/12/2025 0:00 | 0 |
2/13/2025 0:00 | 0 |
2/14/2025 0:00 | 1 |
2/15/2025 0:00 | 1 |
2/16/2025 0:00 | 1 |
2/17/2025 0:00 | 1 |
2/18/2025 0:00 | 1 |
2/19/2025 0:00 | 1 |
2/20/2025 0:00 | 1 |
2/21/2025 0:00 | 1 |
2/22/2025 0:00 | 1 |
2/23/2025 0:00 | 1 |
2/24/2025 0:00 | 1 |
2/25/2025 0:00 | 1 |
2/26/2025 0:00 | 1 |
2/27/2025 0:00 | 1 |
2/28/2025 0:00 | 1 |
3/1/2025 0:00 | 1 |
3/2/2025 0:00 | 0 |
3/3/2025 0:00 | 0 |
3/4/2025 0:00 | 0 |
3/5/2025 0:00 | 0 |
3/6/2025 0:00 | 1 |
3/7/2025 0:00 | 0 |
3/8/2025 0:00 | 0 |
3/9/2025 0:00 | 0 |
3/10/2025 0:00 | 1 |
3/11/2025 0:00 | 1 |
3/12/2025 0:00 | 1 |
3/13/2025 0:00 | 1 |
3/14/2025 0:00 | 1 |
3/15/2025 0:00 | 1 |
3/16/2025 0:00 | 1 |
3/17/2025 0:00 | 1 |
3/18/2025 0:00 | 1 |
3/19/2025 0:00 | 1 |
3/20/2025 0:00 | 1 |
3/21/2025 0:00 | 1 |
3/22/2025 0:00 | 1 |
3/23/2025 0:00 | 1 |
3/24/2025 0:00 | 1 |
3/25/2025 0:00 | 1 |
3/26/2025 0:00 | 1 |
3/27/2025 0:00 | 1 |
3/28/2025 0:00 | 1 |
3/29/2025 0:00 | 1 |
3/30/2025 0:00 | 1 |
3/31/2025 0:00 | 1 |
4/1/2025 0:00 | 1 |
4/2/2025 0:00 | 1 |
4/3/2025 0:00 | 1 |
4/4/2025 0:00 | 1 |
4/5/2025 0:00 | 1 |
4/6/2025 0:00 | 1 |
4/7/2025 0:00 | 1 |
4/8/2025 0:00 | 1 |
4/9/2025 0:00 | 1 |
4/10/2025 0:00 | 1 |
4/11/2025 0:00 | 1 |
4/12/2025 0:00 | 1 |
4/13/2025 0:00 | 1 |
4/14/2025 0:00 | 1 |
4/15/2025 0:00 | 0 |
4/16/2025 0:00 | 0 |
4/17/2025 0:00 | 0 |
4/18/2025 0:00 | 0 |
4/19/2025 0:00 | 0 |
4/20/2025 0:00 | 0 |
4/21/2025 0:00 | 0 |
4/22/2025 0:00 | 0 |
4/23/2025 0:00 | 0 |
4/24/2025 0:00 | 0 |
4/25/2025 0:00 | 0 |
4/26/2025 0:00 | 0 |
4/27/2025 0:00 | 0 |
4/28/2025 0:00 | 0 |
4/29/2025 0:00 | 0 |
4/30/2025 0:00 | 0 |
5/1/2025 0:00 | 0 |
5/2/2025 0:00 | 0 |
5/3/2025 0:00 | 0 |
5/4/2025 0:00 | 0 |
5/5/2025 0:00 | 0 |
5/6/2025 0:00 | 0 |
5/7/2025 0:00 | 0 |
5/8/2025 0:00 | 0 |
5/9/2025 0:00 | 0 |
5/10/2025 0:00 | 0 |
5/11/2025 0:00 | 1 |
5/12/2025 0:00 | 1 |
5/13/2025 0:00 | 1 |
5/14/2025 0:00 | 1 |
5/15/2025 0:00 | 1 |
5/16/2025 0:00 | 1 |
5/17/2025 0:00 | 1 |
5/18/2025 0:00 | 1 |
5/19/2025 0:00 | 1 |
5/20/2025 0:00 | 1 |
5/21/2025 0:00 | 1 |
5/22/2025 0:00 | 1 |
5/23/2025 0:00 | 1 |
5/24/2025 0:00 | 1 |
5/25/2025 0:00 | 1 |
5/26/2025 0:00 | 1 |
5/27/2025 0:00 | 1 |
5/28/2025 0:00 | 1 |
5/29/2025 0:00 | 0 |
Hi @AMONT,
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions.
Thank you.
Hi @AMONT,
Thank you for reaching out to the Microsoft Fabric Community forum.
Could you please confirm whether the issue has been resolved? I wanted to check if you had the opportunity to review the information provided by @Irwan. Please feel free to contact us if you have any further questions.
Thank you.
Hello @AMONT,
I am not sure, but the second interval is written as 6/24/24-10/24/24; from your data it should be 6/24/24-10/20/24.
Regardless, please check whether this accommodates your need.
You can do this with PQ or DAX.
- using PQ
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("ddhLjhwhEATQq1iztkSTQAG+iuX7X8OFPz1y+aU0q9j0RNXrAPX37x+jxOv+i/7l9e31+vj68fr48fVXXP+J6984HDfH3fFwfDmejpfjzbi9HKvlVZyq+lXU/CoqfhX1vopqX0Wtr6LSV1Hnu4U633HS2fWq+1UXrG5YXbFO0Lvj5Xgzpt/r6fcdh+Pm2C3p97Lfy34v+73s97LfSaiTUCehTkKdhDoJdRLqJNRJqNNQp6FOQ52GOg11Guo01PmE+o5dsrpluCWHdnpop4d2Guo01Gmo01CnoU5DnRlUDu2i30W/i34X/S76XfS76HfR76LfZb/Lfpf9Lvtd9rvsd9nvst9lv8t+l/0u+132u+x32e+y32W/y36X/S77Xfa77HfT76bfTb+bfjf9bvrd9Lvpd9Pvtt9tv9t+t/1u+932u+132++2322/2363/W773fa77Xfb77bfbb/bfrf9bvu9X7BanvfuWC3vWC3vWC3vWC3vWC3vWC3vWC1Pnaxm0pNoT540JduTJ10J9+RJW9I9edKXeE+ua+7Jdc89uS66J+9JPpL8SnJd6U+uO/3Jdak/AHWrPzn71pLEfAq18CHUwmdQCx9BLXwCtfABVPOu5l0T3jXhXRPeNeFdE9414V0T3jXhXRPeNeFdE97V43zypC/n+eRJXw70yZO+nOiTJ3050gcg+4ZXOrzS4ZUOr3SYcZhxmHEUfo2j8FscT8afOb+u8WT8mfMLG0/Gn3nStSZl+UPEyZO6/Cni1z+f5Elfr3QkKx3JSkey0pGsdCQrHclKR7LSkTLmbfnPSI9nGkzbP+mfT/xN+790ML2YTqaL6Wb6G/X/cXUcjt2vumB1w+qK1R2rS1a3DLcMtwy3DLcMt4xhH5fj6Xg53lb2ciypQb9Bv/Hw+zftTNU7impHUesoKh1FncN+w37DfsN+4+n3/ZFuWF2xumN1yeqWoTcbT7/v2C8x/BbDLek37DfsN+y3kV4rejeN09k4nY3T2UivcTobp7NxOtuT3ruc21W9lPak9471UprpNdNrptdMr5leM71mes30muk102um10yvmV7L6HE6m6ezeTo7/XZOZ+d0dk5n53R2+u2czs7p7JzObr/dfrv9dvvt9tt99Hcf/d1Hf/fR3330dx/93Ud/99HfffR3H/396fcdu2W4ZbhluGVTy1FUcnBoB4d2cGgHh3bwjjo4tINDOzi0w2f8MNRhqMNQh6EOD+3w0A4P7fDQDg/teEJ9x27JoR0e2uGhHR7a4aEdHtrhoR2A+uMn", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [DateOnly = _t, #"Total Days With 3 Exceed Ref" = _t]),
#"Extracted Text Before Delimiter" = Table.TransformColumns(Source, {{"DateOnly", each Text.BeforeDelimiter(_, " "), type text}}),
#"Changed Type with Locale" = Table.TransformColumnTypes(#"Extracted Text Before Delimiter", {{"DateOnly", type date}}, "en-US"),
#"Changed Type" = Table.TransformColumnTypes(#"Changed Type with Locale",{{"DateOnly", Int64.Type}}),
#"Sorted Rows" = Table.Sort(#"Changed Type",{{"DateOnly", Order.Ascending}}),
#"Added Index" = Table.AddIndexColumn(#"Sorted Rows", "Index", 0, 1, Int64.Type),
#"Added Index1" = Table.AddIndexColumn(#"Added Index", "Index.1", 1, 1, Int64.Type),
#"Merged Queries" = Table.NestedJoin(#"Added Index1", {"Index"}, #"Added Index1", {"Index.1"}, "Added Index1", JoinKind.LeftOuter),
#"Expanded Added Index1" = Table.ExpandTableColumn(#"Merged Queries", "Added Index1", {"Total Days With 3 Exceed Ref"}, {"Added Index1.Total Days With 3 Exceed Ref"}),
#"Added Custom" = Table.AddColumn(#"Expanded Added Index1", "Custom", each if [Total Days With 3 Exceed Ref]<>[Added Index1.Total Days With 3 Exceed Ref] then 1 else 0),
#"Filtered Rows" = Table.SelectRows(#"Added Custom", each ([Custom] = 1) and ([Added Index1.Total Days With 3 Exceed Ref] <> null)),
#"Removed Other Columns" = Table.SelectColumns(#"Filtered Rows",{"DateOnly", "Total Days With 3 Exceed Ref"}),
#"Added Custom1" = Table.AddColumn(#"Removed Other Columns", "Start", each if [Total Days With 3 Exceed Ref]="1" then [DateOnly] else null),
#"Filled Down" = Table.FillDown(#"Added Custom1",{"Start"}),
#"Added Custom2" = Table.AddColumn(#"Filled Down", "End", each if [Total Days With 3 Exceed Ref]="0" then [DateOnly]-1 else null),
#"Filled Up" = Table.FillUp(#"Added Custom2",{"End"}),
#"Removed Duplicates" = Table.Distinct(#"Filled Up", {"Start"}),
#"Changed Type1" = Table.TransformColumnTypes(#"Removed Duplicates",{{"Start", type date}, {"End", type date}}),
#"Removed Other Columns1" = Table.SelectColumns(#"Changed Type1",{"Start", "End"})
in
#"Removed Other Columns1"
- using DAX
Create a new calculated column that looks up the flag value from the previous date (it is named Next Value below).
Next Value =
var _Date =
MAXX(
FILTER(
'Table',
'Table'[DateOnly]<EARLIER('Table'[DateOnly])
),
'Table'[DateOnly]
)
Return
MAXX(
FILTER(
'Table',
'Table'[DateOnly]=_Date
),
'Table'[Total Days With 3 Exceed Ref]
)
DAX =
// _Sum keeps only the boundary rows: the dates whose flag differs from the previous date's value.
var _Sum =
SUMMARIZE(
FILTER(
'Table',
'Table'[Total Days With 3 Exceed Ref]<>'Table'[Next Value]
),
'Table'[DateOnly],
'Table'[Total Days With 3 Exceed Ref],
'Table'[Next Value]
)
// For each boundary date, Next Date1 is the first boundary at or after it whose previous-day value was 1
// (the day after a run of 1s ends), so Datediff is the length of the run that starts on that date.
var _Next =
ADDCOLUMNS(
ADDCOLUMNS(
_Sum,
"Next Date1",
MINX(
FILTER(
_Sum,
'Table'[DateOnly]>=EARLIER('Table'[DateOnly])&&
'Table'[Next Value]=1
),
'Table'[DateOnly]
)
),
"Datediff",
DATEDIFF(
[DateOnly],
[Next Date1],
DAY
)
)
Return
// Keep only the run-start rows (Datediff > 0) and report each run's start and end dates.
SELECTCOLUMNS(
FILTER(
_Next,
[Datediff]>0
),
"Start",[DateOnly],
"End",[Next Date1]-1
)
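If you also want the COUNT OF DAYS and INSTANCE values from the desired output, a small extension of the same table expression is one option; this is only a sketch that reuses the _Next variable above, with the Return block swapped out:
Return
SELECTCOLUMNS(
    FILTER(
        _Next,
        [Datediff]>0
    ),
    "Start", [DateOnly],
    "End", [Next Date1]-1,
    "Count of Days", [Datediff]
)
The number of instances is then COUNTROWS of this calculated table, and the total exceedance days is the sum of its Count of Days column.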
Hi, thanks for taking a stab at this!
For the 10/20 vs 10/24 dates, that was me, just a typo; your values are correct.
I went the DAX route first, and I get a circular dependency error on the first step. In my dataset, Total Days With 3 Exceed Ref is a measure, and I think that's what's causing it. That measure is the product of three other measures, which is why I think this has become such a headache for me:
The following measures are what produce Total Days with 3 Exceed Ref:
The first measure summarizes the column Exceeds Ref, reducing its range from 0 to 36 down to 0 to 3, because I have the hours filtered in my dataset.
Hello @AMONT,
Here is the tweak if [Total Days With 3 Exceed Ref] is a measure.
You can continue with the steps below after creating the measure for [Total Days With 3 Exceed Ref].
The right side of the screenshot above uses [Total Days With 3 Exceed Ref] as a measure, which gives exactly the same result as the calculated-column approach on the left side.
Because a measure is evaluated in filter context, you need the 'DateOnly' column in the visual as a filter, or you will get a blank result because there is no filter.
1. Create a new measure that flags the dates where the value changes from the previous date (the boundaries of each consecutive run):
Measure Changed Value =
var _Select = SELECTEDVALUE('Table'[DateOnly])
var _Date =
MAXX(
FILTER(
ALL('Table'),
'Table'[DateOnly]<_Select
),
'Table'[DateOnly]
)
var _Value =
MAXX(
FILTER(
ALL('Table'),
'Table'[DateOnly]=_Date
),
[Measure Total Days With 3 Exceed Ref]
)
Return
IF(
[Measure Total Days With 3 Exceed Ref]<>_Value,
1,
0
)
Measure Min DateOnly =
MINX(
FILTER(
ALL('Table'),
'Table'[DateOnly]>=SELECTEDVALUE('Table'[DateOnly])&&
[Measure Changed Value]=1&&
[Measure Total Days With 3 Exceed Ref]=1
),
'Table'[DateOnly]
)
Measure Remove Duplicate =
IF(
[Measure Min DateOnly]=SELECTEDVALUE('Table'[DateOnly]),
1,
0
)
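The [Measure Count] in the next step refers to a [Measure Next Date] that is not defined in this thread. By analogy with the [Next Date1] column in the calculated-table version above, it should return the first date after the run start where the flag drops back to 0; a guess at a definition (treat it as an assumption, not the original measure) is:
Measure Next Date =
// Hypothetical reconstruction: first date after the run start where the value changes back to 0.
var _Start = [Measure Min DateOnly]
Return
MINX(
    FILTER(
        ALL('Table'),
        'Table'[DateOnly] > _Start &&
        [Measure Changed Value] = 1 &&
        [Measure Total Days With 3 Exceed Ref] = 0
    ),
    'Table'[DateOnly]
)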
5. Create a new measure for calculating the number of consecutive days:
Measure Count =
DATEDIFF(
[Measure Min DateOnly],
[Measure Next Date],
DAY
)
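To answer Q1 (how many separate exceedance periods there were) and Q3 (the longest one), two more measures can be layered on top; these are sketches with made-up names that iterate the dates and reuse the measures above through context transition:
Measure Instances =
SUMX(
    VALUES('Table'[DateOnly]),
    [Measure Remove Duplicate]
)
Measure Longest Run =
MAXX(
    FILTER(
        VALUES('Table'[DateOnly]),
        [Measure Remove Duplicate] = 1
    ),
    [Measure Count]
)
[Measure Remove Duplicate] is 1 only on the first day of each run, so summing it counts the runs (9 in the example), and taking the maximum [Measure Count] over those start dates gives the longest run (119 days in the example).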
First off, I appreciate all the time you've put into this. I owe you big time! I know this is so close to correct, but I have no idea what I'm doing wrong.
The [Measure Total Days With 3 Exceed Ref] doesn't work in my pbix. I cannot use the SUM-based measure since [Total Days With 3 Exceed Ref] is not a column in the table, just a measure built from several other measures, as described above. As a result, I get odd results in all the other formulas you provided.
I think I need a measure that mimics yours, but based on summing the [Total Days With 3 Exceed Ref] measure that I have. In the card that I added, it defaults to the sum, so I guess I need that as a measure to get the same result?
One thing to make clear: this is a single-well (Well 100) example. I have over 100 wells, so I am assuming I need the measure to be filtered by well number.
Hello @AMONT,
I think that if [measure total days with exceed] is a measure, then you should have no issue.
The SUM in [measure total days with exceed] in my pbix is only there to make [measure total days with exceed] behave as a measure.
You can use your own [measure total days with exceed] measure to continue the calculation.
If you still get a wrong result, perhaps you have another calculation that affects your final desired outcome.
Please share a pbix containing a sample of your original data so we can work out how to achieve your desired outcome.
Please remove any confidential information.
Thank you.
Hello,
I began to pull a subset of the data with the Well 100 example, and once that data was on its own, without any other wells, the measures and everything worked just like your screenshots!
So it will definitely work, but I think I need things filtered by the [Well_Number] field that I have, as all the wells can have the same dates. In that case, with the measures that you provided, would I just update the filters in some way to make the measures calculate not only by DateOnly, but also by Well_Number?
I can pull a data subset with 2-3 wells if that would help.
Hello @AMONT,
Glad it worked as intended.
If you need to limit to a certain ID, you can add more conditions inside the filter.
For example:
MAXX(
FILTER(
ALL('Table'),
'Table'[DateOnly]<_Select&&
'Table'[Well_ID]=_ID // add var _ID = SELECTEDVALUE('Table'[Well_ID]) at the top of the measure
),
'Table'[DateOnly]
)
Or using SELECTEDVALUE directly should give the same result:
MAXX(
FILTER(
ALL('Table'),
'Table'[DateOnly]<_Select&&
'Table'[Well_ID]=SELECTEDVALUE('Table'[Well_ID])
),
'Table'[DateOnly]
)
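Put together, a sketch of the change-detection measure limited to one well could look like the following (this assumes a 'Table'[Well_ID] column; swap in your actual [Well_Number] field, and add the same condition to the other measures' FILTER calls as well):
Measure Changed Value per Well =
// Hypothetical sketch: same logic as [Measure Changed Value], restricted to the selected well.
var _Select = SELECTEDVALUE('Table'[DateOnly])
var _ID = SELECTEDVALUE('Table'[Well_ID])
// Most recent earlier date for the same well.
var _Date =
MAXX(
    FILTER(
        ALL('Table'),
        'Table'[DateOnly] < _Select &&
        'Table'[Well_ID] = _ID
    ),
    'Table'[DateOnly]
)
// Flag value on that previous date for the same well.
var _Value =
MAXX(
    FILTER(
        ALL('Table'),
        'Table'[DateOnly] = _Date &&
        'Table'[Well_ID] = _ID
    ),
    [Measure Total Days With 3 Exceed Ref]
)
Return
IF(
    [Measure Total Days With 3 Exceed Ref] <> _Value,
    1,
    0
)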
This way, the measure only calculates over previous dates AND for the same ID.
You can add more filter conditions as needed, in the same way.
Hope this helps.
Thank you.
@Irwan Thank you so so so much! This was it. Thank you for basically just doing all my coding, but being able to step through this has also helped me with understanding the DAX. I'll mark this as solved!