Dear All,
A couple of days ago I posted a topic asking for help with a measure that calculates the number of open incidents per day. I've made some small adjustments since then.
The objective of this measure is to calculate the number of incidents whose last status, within the filter context of the visual, is not "Closed".
The measure I created works in the sense that it shows the correct results. However, even in my test environment of only ~15,000 rows, the performance isn't great.
When I attempt to run the measure in my live environment with ~15,000,000 rows, it's completely unusable: I just get an error that it runs out of memory.
As far as I can see there are two issues. However, I'm at a loss on how to calculate this result in a different way. What would be the correct way to rewrite this measure so that it's usable in a larger data model?
For reference, my fact table (Raw_Transactions) looks something like the screenshots below.
The only connected table is a date table called '_Date'.
Thanks in advance for the assistance!
Open Incidents (Summarize) =
VAR MaxDate = MAX( '_Date'[Date] )
VAR Result =
    CALCULATE(
        COUNTX(
            ADDCOLUMNS(
                SUMMARIZE(
                    Raw_Transactions,
                    Raw_Transactions[IncidentId],
                    Raw_Transactions[CountOfTransaction]
                ),
                "@MaxTransaction",
                    VAR _Result =
                        CALCULATE(
                            MAX( Raw_Transactions[CountOfTransaction] ),
                            ALLEXCEPT( Raw_Transactions, Raw_Transactions[IncidentId] ),
                            '_Date'[Date] <= MaxDate
                        )
                    RETURN
                        IF(
                            _Result = Raw_Transactions[CountOfTransaction],
                            _Result
                        )
            ),
            [@MaxTransaction]
        ),
        '_Date'[Date] <= MaxDate,
        Raw_Transactions[TransactionType] <> "Closed"
    )
RETURN
    Result
Hi @Thogen ,
Thanks for the reply from lbendlin.
If I understand correctly: your measure returns the correct results, but performance is already poor at ~15,000 rows, and you want to optimize it. Is that right?
I suggest you consider the following aspects:
For more detailed guidance, please see:
Optimization guide for Power BI - Power BI | Microsoft Learn
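One of the main costs in your measure is the CALCULATE + ALLEXCEPT inside ADDCOLUMNS: it forces a separate filtered scan of the fact table for every incident row, and the date filter is applied twice (once inside the row expression and once in the outer CALCULATE). As an untested sketch of an alternative shape, assuming Raw_Transactions[CountOfTransaction] is a running counter per incident (so its maximum identifies the latest transaction) and that each IncidentId + CountOfTransaction pair identifies exactly one row, you could build one small table of incidents with their latest transaction number and only then look up that transaction's type:

```dax
Open Incidents (Sketch) =
VAR MaxDate = MAX( '_Date'[Date] )
-- One row per incident with its latest transaction number up to MaxDate
VAR LastPerIncident =
    CALCULATETABLE(
        ADDCOLUMNS(
            VALUES( Raw_Transactions[IncidentId] ),
            "@LastTransaction", CALCULATE( MAX( Raw_Transactions[CountOfTransaction] ) )
        ),
        '_Date'[Date] <= MaxDate
    )
-- Keep only incidents whose latest transaction is not "Closed"
VAR OpenIncidents =
    FILTER(
        LastPerIncident,
        VAR _Last = [@LastTransaction]
        RETURN
            CALCULATE(
                SELECTEDVALUE( Raw_Transactions[TransactionType] ),
                Raw_Transactions[CountOfTransaction] = _Last,
                '_Date'[Date] <= MaxDate
            ) <> "Closed"
    )
RETURN
    COUNTROWS( OpenIncidents )
```

This applies the date filter once when building LastPerIncident, and the final lookup relies on context transition (the IncidentId in the row context of FILTER becomes a filter inside CALCULATE) rather than ALLEXCEPT. Please validate the results against your current measure and check the timings in DAX Studio before relying on it.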
If you have any other questions please feel free to contact me.
Best Regards,
Yang
Community Support Team
If any post helps, please consider accepting it as the solution to help other members find it more quickly.
If I misunderstand your needs or you still have problems on it, please feel free to let us know. Thanks a lot!
Look at the query plan too, especially the Records column there. That may indicate where you have Cartesian products.
Please provide sample data that fully covers your issue.
Please show the expected outcome based on the sample data you provided.