Good Morning,
I have a table that contains all of the leave history for my organisation, including a start and end date for each leave. What I need is a slicer where a manager can select a date range, which then filters a table to show any leave entry where at least one of its days falls within that range.
E.g. if they select May 2025 and someone has a leave entry starting 1 January 2025 and ending 30 June 2025, it should show up in the list.
Now, I've done this many times before using a custom measure similar to the one below and a date table. However, in this case I'm getting an error stating that the query is hitting the maximum resource limit of 1 GB of memory and failing. To work around this I flattened the data into one row per day of leave, as a view in the Fabric data warehouse. That overcame the memory error, but every change of the slicer date range now takes approximately 25-30 seconds to refresh.
Relevant info:
- This report uses DirectQuery to a Fabric Data Warehouse.
- This is on a Fabric trial capacity (I believe that's F64).
- The leave history table has 12,700 rows, which I didn't think was too many.
- The flattened table has 71,498 rows.
Custom measure:
Filter_Date_Range_Hist =
VAR SelectedStart = MIN('Date Table - Leave History'[Date])
VAR SelectedEnd = MAX('Date Table - Leave History'[Date])
RETURN
    CALCULATE(
        COUNTROWS(q2employee_leave_history),
        FILTER(
            q2employee_leave_history,
            q2employee_leave_history[date_start] <= SelectedEnd &&
            q2employee_leave_history[date_end] >= SelectedStart
        )
    )
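The core of the measure is the standard interval-overlap test: two date ranges overlap exactly when each one starts no later than the other ends. A minimal sketch of that test in Python (the `overlaps` function name is illustrative, not part of the report):

```python
from datetime import date

def overlaps(leave_start: date, leave_end: date,
             range_start: date, range_end: date) -> bool:
    # Two ranges overlap iff each starts no later than the other ends --
    # the same condition the DAX FILTER applies row by row.
    return leave_start <= range_end and leave_end >= range_start

# Leave running 1 Jan - 30 Jun 2025 overlaps a May 2025 slicer selection.
print(overlaps(date(2025, 1, 1), date(2025, 6, 30),
               date(2025, 5, 1), date(2025, 5, 31)))  # True
```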
Question: Is there a more efficient way to do this, or is flattening the data the only way forward?
Subquestion: As I'm using a Fabric trial (which I believe is F64), I'm hitting a 1 GB memory limit. I generally use a Power BI Pro workspace; does that also have the 1 GB limit?
Thanks for your help,
Greg
Try using SUMMARIZE to reduce the size of the table you are iterating with FILTER.
FILTER(
    SUMMARIZE(
        q2employee_leave_history,
        q2employee_leave_history[date_start],
        q2employee_leave_history[date_end]
    ),
    q2employee_leave_history[date_start] <= SelectedEnd &&
    q2employee_leave_history[date_end] >= SelectedStart
)
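The reason this helps: SUMMARIZE collapses the table to its distinct (date_start, date_end) pairs, so the overlap test runs once per unique pair rather than once per row, and the filtered pairs then act as a filter on the fact table. A rough Python sketch of that idea, under the assumption that many leave rows share the same date pair (function and variable names are illustrative):

```python
from datetime import date

def count_leave_in_range(rows, range_start, range_end):
    # rows: list of (date_start, date_end) tuples, one per leave entry.
    # Step 1 (SUMMARIZE): collapse to distinct (start, end) pairs so the
    # overlap test runs once per pair, not once per row.
    distinct = set(rows)
    # Step 2 (FILTER): keep only pairs overlapping the slicer range.
    keep = {p for p in distinct
            if p[0] <= range_end and p[1] >= range_start}
    # Step 3 (CALCULATE/COUNTROWS): count rows whose pair survived.
    return sum(1 for r in rows if r in keep)

rows = [
    (date(2025, 1, 1), date(2025, 6, 30)),
    (date(2025, 1, 1), date(2025, 6, 30)),  # duplicate pair, tested once
    (date(2024, 1, 1), date(2024, 2, 1)),   # outside the range
]
print(count_leave_in_range(rows, date(2025, 5, 1), date(2025, 5, 31)))  # 2
```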
So simple, yet perfect.
Thank you so much. Preliminary tests show this has worked for me.