Hi all,
I am using a dataset that was added using import mode, so I am unable to use the Advanced Editor to manipulate the data. Basically, my dataset has some IDs whose volumes vary across different days. What I'd like to do is create a unique ID for each group of consecutive dates that have the same volume. For example:
| ID | Date | Volume |
|---|---|---|
| 111 | 1/1/2025 | 700 |
| 111 | 2/1/2025 | 0 |
| 111 | 3/1/2025 | 0 |
| 111 | 4/1/2025 | 0 |
| 111 | 5/1/2025 | 200 |
| 111 | 6/1/2025 | 200 |
| 222 | 1/1/2025 | 50 |
| 222 | 2/1/2025 | 100 |
| 222 | 3/1/2025 | 100 |
| 222 | 4/1/2025 | 100 |
| 222 | 5/1/2025 | 500 |
| 222 | 6/1/2025 | 500 |
| 333 | 1/1/2025 | 100 |
| 333 | 2/1/2025 | 200 |
| 333 | 3/1/2025 | 200 |
| 333 | 4/1/2025 | 200 |
| 333 | 5/1/2025 | 500 |
| 333 | 6/1/2025 | 500 |
And what I would ideally love to see is:
| ID | Start Date | End Date | Sum of Volume |
|---|---|---|---|
| 111.v1 | 1/1/2025 | 1/1/2025 | 700 |
| 111.v2 | 2/1/2025 | 4/1/2025 | 0 |
| 111.v3 | 5/1/2025 | 6/1/2025 | 400 |
| 222.v1 | 1/1/2025 | 1/1/2025 | 50 |
| 222.v2 | 2/1/2025 | 4/1/2025 | 300 |
| 222.v3 | 5/1/2025 | 6/1/2025 | 1000 |
| 333.v1 | 1/1/2025 | 1/1/2025 | 100 |
| 333.v2 | 2/1/2025 | 4/1/2025 | 600 |
| 333.v3 | 5/1/2025 | 6/1/2025 | 1000 |
I know this is easy to do in the Advanced Editor, but would it be possible at all in DAX?
Thank you!!
Click here to download a solution from OneDrive.
How it works ....
Create a table with your test data
Group it
Group it again
Add a custom column
Remove all columns except the custom column
Expand the column
Add a custom column
Remove the unneeded columns and reorder columns
Please click [thumbs up] because I have tried really hard.
And click [accept solution] if it works .... it clearly does work.
Thank you 😀
Hi @ohqpi ,
If the response provided by @speedramps or @burakkaragoz helped resolve your issue, it would be greatly appreciated if you could mark it as the Accepted Answer; doing so helps others in the community who may be facing a similar challenge.
If you still have questions or need further assistance, feel free to share more details — we’re always here to support you.
Thanks again for being an active part of the Microsoft Fabric Community!
Hi @ohqpi
If the response from @speedramps or @burakkaragoz was helpful in resolving your issue, we would greatly appreciate it if you could mark it as the Accepted Answer. This not only assists you but also aids others in the community who might be encountering similar challenges.
Should you have any further questions or require additional assistance, please don’t hesitate to provide more details. We’re always here to help!
Thank you for being an engaged member of the Microsoft Fabric Community!
Hi @ohqpi
Thank you for your question! I understand that you're looking to create a unique ID for groups of dates that share the same volume in your imported dataset. However, since you're using Import mode, you won't have access to the Advanced Editor to manipulate the data directly.
If the above response resolved your query, please accept it as the solution.
If you have any further questions, feel free to reach out.
Thank you for being a valued member of the Microsoft Fabric Community Forum!
Hi @ohqpi ,
Yes, you can approach this in DAX, but it requires a few calculated columns and measures since DAX does not offer row-by-row transformation like Power Query’s Advanced Editor. Here’s a way you can achieve similar results:
Step 1: Create a grouping key
First, create a calculated column to identify groups where the same ID has the same Volume across consecutive dates.
You’ll need to compare each row with the previous row for the same ID to detect where a new group starts.
GroupKey =
// Volume for the same ID on the previous day (blank when there is no prior row;
// this assumes one row per ID per day, so [Date] - 1 is the previous row's date)
VAR PrevVolume =
    CALCULATE(
        MAX('YourTable'[Volume]),
        FILTER(
            'YourTable',
            'YourTable'[ID] = EARLIER('YourTable'[ID]) &&
            'YourTable'[Date] = EARLIER('YourTable'[Date]) - 1
        )
    )
// 1 when this row starts a new group (the volume changed, or it is the ID's first date);
// kept for clarity - the same boundary test is repeated inline below
VAR NewGroup =
    IF(
        'YourTable'[Volume] <> PrevVolume || ISBLANK(PrevVolume),
        1, 0
    )
// Running count of group starts for this ID up to and including the current date,
// which gives each run of identical volumes its own sequential number
VAR GroupNumber =
    CALCULATE(
        SUMX(
            FILTER(
                'YourTable',
                'YourTable'[ID] = EARLIER('YourTable'[ID]) &&
                'YourTable'[Date] <= EARLIER('YourTable'[Date])
            ),
            IF(
                'YourTable'[Volume] <>
                    CALCULATE(
                        MAX('YourTable'[Volume]),
                        FILTER(
                            'YourTable',
                            'YourTable'[ID] = EARLIER('YourTable'[ID]) &&
                            'YourTable'[Date] = EARLIER('YourTable'[Date]) - 1
                        )
                    )
                || ISBLANK(
                        CALCULATE(
                            MAX('YourTable'[Volume]),
                            FILTER(
                                'YourTable',
                                'YourTable'[ID] = EARLIER('YourTable'[ID]) &&
                                'YourTable'[Date] = EARLIER('YourTable'[Date]) - 1
                            )
                        )
                    ),
                1, 0
            )
        )
    )
RETURN
    // ".v" produces keys like 111.v1, 111.v2 to match the requested format
    'YourTable'[ID] & ".v" & GroupNumber

Step 2: Calculate Start Date, End Date, and Sum of Volume
Now you can aggregate by this GroupKey:
You can do this either in a summary table or via measures in your visual (a short measure sketch follows the summary table example below).
Example Summary Table:
SummaryTable =
SUMMARIZE(
    'YourTable',
    'YourTable'[GroupKey],
    "ID", MAX('YourTable'[ID]),
    "Start Date", MIN('YourTable'[Date]),
    "End Date", MAX('YourTable'[Date]),
    "Sum of Volume", SUM('YourTable'[Volume])
)
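If you prefer the measure route instead of a calculated table, here is a minimal sketch. It assumes the GroupKey calculated column from Step 1 and the placeholder table name 'YourTable'; the measure names are just illustrative. Put ID, GroupKey and these measures in a table visual:

// Each measure evaluates over only the rows of the GroupKey shown on that visual row
Group Start Date = MIN('YourTable'[Date])

Group End Date = MAX('YourTable'[Date])

Group Volume = SUM('YourTable'[Volume])

With GroupKey on the rows of the visual, each group's rows are filtered automatically, so you should get the same start date, end date, and volume totals as the summary table.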
Let me know if you need help adapting the logic to your exact table or date format!