Hi All,
I have a requirement to build one report showing only the first date entry where there are multiple entries, and another report showing all entries other than the first.
The report has columns such as Location, District, PermitNumber, Date, and numeric value fields.
The Date field has multiple entries for each Location/District.
Attaching .pbix and excel output of the report.
Thanks
Solved! Go to Solution.
Hi,
PBI file attached.
Hope this helps.
Hi MachS,
We are following up to see if what we shared solved your issue. If you need more support, please reach out to the Microsoft Fabric community.
Thank you.
Thank you, @vojtechsima, @pcoley, @Ashish_Mathur, @krishnakanth240 and @danextian for your responses.
Hi MachS,
We appreciate your inquiry through the Microsoft Fabric Community Forum.
We would like to check whether you have had a chance to review the solutions provided by @vojtechsima, @pcoley and @Ashish_Mathur to resolve the issue. We hope the information provided helps to clear the query. Should you have any further questions, please feel free to contact the Microsoft Fabric community.
Thank you.
Hi @MachS
Only those with elevated access in the forum can attach files. You may post a link to the .pbix file stored in the cloud. Also, how do you identify which of those date entries should be picked? What are the criteria?
Hi @MachS
Can you please share the data as text so that we can copy it into Power BI to work with? Also, what exact output are you looking for?
Hi,
Share the download link of those files. Show the expected result very clearly.
Hey @MachS,
Create a measure like this:
not min water level first filter =
VAR _distincTable =
    VALUES ( 'table'[District ID] )
VAR _add =
    ADDCOLUMNS (
        _distincTable,
        "minDate",
            CALCULATE (
                MIN ( 'table'[Date] ),
                REMOVEFILTERS ( 'table'[Date] )
            )
    )
VAR _newtable =
    CALCULATETABLE (
        'table',
        TREATAS ( _add, 'table'[District ID], 'table'[Date] )
    )
VAR _output =
    COUNTROWS ( EXCEPT ( 'table', _newtable ) )
RETURN
    _output
Then construct your table visual and add the measure as a filter, switching between 'is blank' and 'is not blank' depending on whether you want the first value or the values other than the first per District ID.
I attached the demo PBIX.
Unable to attach a file; please find a screenshot of the sample data.
Hi All,
Thanks for all your interest.
I have managed to display only the first date record for each District ID with the DAX below:

CALCULATE (
    MIN ( 'FL|SWFWMD'[beginDate] ),
    ALLEXCEPT (
        'FL|SWFWMD',
        'FL|SWFWMD'[beginDate],
        MonitoringLocationMetaData[SWFWMD District ID]
    )
)

The task now is to create another table showing the records other than the first date.
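One possible approach for the "other than first" table is a flag measure, sketched below. This is a minimal sketch, not a tested solution: 'table', [District ID] and [Date] are placeholder names standing in for the actual table and columns ('FL|SWFWMD'[beginDate] and MonitoringLocationMetaData[SWFWMD District ID] in this model). The measure returns 1 only for rows whose date is later than the first date of their District ID:

```dax
-- Sketch only: 'table', [District ID] and [Date] are placeholder names.
Is Not First Entry =
VAR _minDate =
    CALCULATE (
        MIN ( 'table'[Date] ),
        ALLEXCEPT ( 'table', 'table'[District ID] )
    )
RETURN
    IF ( SELECTEDVALUE ( 'table'[Date] ) > _minDate, 1 )
```

Adding this measure as a visual-level filter set to "is not blank" would keep only the non-first rows, mirroring the blank/non-blank switch described earlier in the thread.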
Sample data:
District ID | Date | Remark Code Piezo | Water Level Value |
3339 | 12/2/2025 14:15 | Normal | 85.88000000000001 |
3339 | 12/8/2025 12:43 | Normal | 84.47000000000001 |
3339 | 12/16/2025 14:32 | Normal | 84.07000000000002 |
3339 | 12/22/2025 12:30 | Normal | 83.75000000000001 |
3339 | 12/30/2025 10:49 | Normal | 84.26000000000002 |
3338 | 12/2/2025 14:20 | Normal | 77.60800000000002 |
3338 | 12/8/2025 12:38 | Normal | 78.27800000000002 |
3338 | 12/16/2025 14:35 | Normal | 78.18800000000002 |
3338 | 12/22/2025 12:27 | Normal | 78.03800000000003 |
3338 | 12/30/2025 10:42 | Normal | 77.84800000000001 |
3337 | 12/2/2025 14:21 | Normal | 82.62400000000001 |
3337 | 12/8/2025 12:34 | Normal | 84.13400000000001 |
3337 | 12/16/2025 14:38 | Normal | 83.32400000000001 |
3337 | 12/22/2025 12:20 | Normal | 83.29400000000001 |
3337 | 12/30/2025 10:40 | Normal | 83.02400000000002 |
3336 | 12/5/2025 9:33 | Normal | 88.10000000000002 |
3336 | 12/8/2025 11:00 | Normal | 88.50000000000001 |
3336 | 12/17/2025 11:50 | Normal | 88.58000000000001 |
3336 | 12/22/2025 12:01 | Normal | 88.24000000000002 |
3336 | 12/30/2025 9:43 | Normal | 88.00000000000001 |
3282 | 12/4/2025 11:42 | Normal | 110.39000000000001 |
3282 | 12/11/2025 9:10 | Normal | 111.46000000000002 |
3282 | 12/18/2025 8:38 | Normal | 110.72000000000001 |
3282 | 12/22/2025 11:53 | Normal | 111.11000000000001 |
3282 | 12/29/2025 0:00 | Normal | 111.03000000000002 |
3281 | 12/4/2025 11:31 | Normal | 112.82000000000002 |
3281 | 12/11/2025 9:09 | Normal | 113.81000000000002 |
3281 | 12/18/2025 8:30 | Normal | 113.13000000000001 |
3281 | 12/22/2025 11:49 | Normal | 112.98000000000002 |
3281 | 12/29/2025 0:00 | Normal | 112.82000000000002 |
3280 | 12/4/2025 11:41 | Normal | 107.76000000000002 |
3280 | 12/11/2025 9:22 | Normal | 108.28000000000002 |
3280 | 12/18/2025 8:46 | Normal | 108.05000000000001 |
3280 | 12/22/2025 11:58 | Normal | 107.99000000000002 |
3280 | 12/29/2025 0:00 | Normal | 107.91000000000003 |
3278 | 12/3/2025 12:08 | Normal | 105.50000000000003 |
3278 | 12/11/2025 7:32 | Normal | 106.32000000000002 |
3278 | 12/17/2025 13:25 | Normal | 106.23000000000002 |
3278 | 12/22/2025 8:43 | Normal | 105.89000000000001 |
3278 | 12/29/2025 0:00 | Normal | 105.82000000000002 |
3228 | 12/4/2025 13:06 | Normal | 97.57000000000001 |
3228 | 12/11/2025 10:40 | Normal | 97.45000000000002 |
3228 | 12/18/2025 10:02 | Normal | 97.14000000000001 |
3228 | 12/22/2025 13:00 | Normal | 97.27000000000001 |
3228 | 12/29/2025 0:00 | Normal | 97.14000000000003 |
3227 | 12/4/2025 12:34 | Normal | 96.03000000000003 |
3227 | 12/11/2025 10:15 | Normal | 95.66000000000004 |
3227 | 12/18/2025 9:28 | Normal | 95.36000000000003 |
3227 | 12/22/2025 12:27 | Normal | 95.80000000000004 |
3227 | 12/29/2025 0:00 | Normal | 95.51000000000003 |
3226 | 12/3/2025 12:47 | Normal | 101.34000000000003 |
3226 | 12/11/2025 8:10 | Normal | 101.27000000000002 |
3226 | 12/17/2025 14:01 | Normal | 101.40000000000002 |
3226 | 12/22/2025 9:29 | Normal | 101.09000000000003 |
3226 | 12/29/2025 0:00 | Normal | 100.95000000000002 |
3225 | 12/3/2025 12:52 | Normal | 108.78000000000002 |
3225 | 12/11/2025 8:15 | Normal | 109.06000000000002 |
3225 | 12/17/2025 14:08 | Normal | 108.88000000000002 |
3225 | 12/22/2025 9:37 | Normal | 108.78000000000002 |
3225 | 12/29/2025 0:00 | Normal | 108.66000000000003 |
Using the above DAX on the sample data, I am able to display the first date records as below:
District ID | Mindatetest | Remark Code Piezo | Water Level Value |
1330 | 12/10/2025 0:00 | Normal | 204.42000000000002 |
1348 | 12/2/2025 0:00 | Not Accessible | 661.0200000000002 |
1349 | 12/2/2025 0:00 | Normal | 1048.76 |
1356 | 12/2/2025 0:00 | Normal | 1285.5800000000004 |
1357 | 12/2/2025 0:00 | Normal | 1282.0200000000004 |
1358 | 12/2/2025 0:00 | Normal | 1265.5600000000002 |
1359 | 12/2/2025 0:00 | Normal | 1263.4200000000003 |
1360 | 12/2/2025 0:00 | Normal | 1281.58 |
1390 | 12/4/2025 0:00 | Normal | 1215.7600000000004 |
1395 | 12/3/2025 0:00 | Normal | 1167.24 |
1404 | 12/3/2025 0:00 | Not Accessible | 998.6800000000001 |
1405 | 12/3/2025 0:00 | Not Accessible | 982.1833333333334 |
1407 | 12/3/2025 0:00 | Normal | 1059.5000000000002 |
1408 | 12/3/2025 0:00 | Normal | 1030.44 |
1409 | 12/3/2025 0:00 | Normal | 1031.2000000000003 |
1410 | 12/3/2025 0:00 | Normal | 1037.44 |
1411 | 12/3/2025 0:00 | Normal | 1093.9900000000002 |
1412 | 12/3/2025 0:00 | Normal | 1010.6000000000003 |
1413 | 12/3/2025 0:00 | Normal | 572.0000000000002 |
1414 | 12/3/2025 0:00 | Normal | 531.1600000000002 |
The required output now is as follows:
District ID | Date Meter Read | Remark Code | Water Level Value |
1348 | 12/9/2025 | Not Accessible | |
1348 | 12/16/2025 | Not Accessible | |
1348 | 12/22/2025 | Not Accessible | |
1348 | 12/29/2025 | Not Accessible | |
1349 | 12/9/2025 | Normal | 105.31 |
1349 | 12/16/2025 | Normal | 104.85 |
1349 | 12/22/2025 | Normal | 105.11 |
1349 | 12/29/2025 | Normal | 104.41 |
1356 | 12/10/2025 | Normal | 128.94 |
1356 | 12/15/2025 | Normal | 128.68 |
1356 | 12/22/2025 | Normal | 128.35 |
1356 | 12/29/2025 | Normal | 128.24 |
1357 | 12/10/2025 | Normal | 128.55 |
1357 | 12/15/2025 | Normal | 128.33 |
1357 | 12/22/2025 | Normal | 128.07 |
1357 | 12/29/2025 | Normal | 127.93 |
1358 | 12/10/2025 | Normal | 126.86 |
1358 | 12/15/2025 | Normal | 126.65 |
1358 | 12/22/2025 | Normal | 126.43 |
1358 | 12/29/2025 | Normal | 126.31 |
1359 | 12/10/2025 | Normal | 126.51 |
1359 | 12/15/2025 | Normal | 126.43 |
1359 | 12/22/2025 | Normal | 126.25 |
1359 | 12/29/2025 | Normal | 126.15 |
1360 | 12/10/2025 | Normal | 128.4 |
1360 | 12/15/2025 | Normal | 128.25 |
1360 | 12/22/2025 | Normal | 128.11 |
1360 | 12/29/2025 | Normal | 127.99 |
1390 | 12/11/2025 | Normal | 121.69 |
1390 | 12/17/2025 | Normal | 121.32 |
1390 | 12/23/2025 | Normal | 121.47 |
1390 | 12/30/2025 | Normal | 121.16 |
Hope this is clear.
Thanks
Please try with the COALESCE function:
COALESCE(<expression1>, <expression2>, ..., <expressionN>)
COALESCE (
    CALCULATE (
        MIN ( 'FL|SWFWMD'[beginDate] ),
        ALLEXCEPT (
            'FL|SWFWMD',
            'FL|SWFWMD'[beginDate],
            MonitoringLocationMetaData[SWFWMD District ID]
        )
    ),
    "NotAvailable"
)
If this response was helpful in any way, I’d gladly accept a kudo.
Please mark it as the correct solution. It helps other community members find their way faster.
As I can see, you don't have any field to ensure that your metric corresponds to the first entry.
You should have an index field (create one in your ETL steps in Power Query, buffering the table if needed; Power Query does not guarantee a default row order, so you should add the index in the early steps or buffer the table first).
Then, in the report, use a measure with CALCULATE to select the appropriate row, filtering the table by the minimum value of the index field for the respective date.
For example:
First WaterLevel Opt1 =
CALCULATE (
    SELECTEDVALUE ( 'YourTable'[WaterLevel] ),
    FILTER (
        'YourTable',
        'YourTable'[Index]
            = CALCULATE (
                MIN ( 'YourTable'[Index] ),
                ALLEXCEPT ( 'YourTable', 'YourTable'[Date] )
            )
    )
)

First WaterLevel Opt2 =
SUMX (
    VALUES ( 'YourTable'[Date] ),
    CALCULATE (
        SELECTEDVALUE ( 'YourTable'[WaterLevel] ),
        FILTER (
            'YourTable',
            'YourTable'[Index]
                = CALCULATE (
                    MIN ( 'YourTable'[Index] ),
                    ALLEXCEPT ( 'YourTable', 'YourTable'[Date] )
                )
        )
    )
)

(Note: the comparison in Opt2 is wrapped in FILTER, since a CALCULATE expression cannot appear inside a plain boolean filter argument.)
I hope this helps; if so, kudos are welcome.
@MachS
If this response was helpful in any way, I’d gladly accept a kudo.
Please mark it as the correct solution. It helps other community members find their way faster.