I have a simple but large table (10+ million rows) with the following fields:
User, HoursWorked, Date, ProjectNumber, HourCode
My report should look something like this:
The user chooses a year (after this filter is applied, around 500K rows should remain)
A summary table showing users and hours worked per month for that year, grouped by ProjectNumber
A summary table showing users and hours worked per month for that year, grouped by HourCode
A detail table showing the individual hours per day, per HourCode, per project when the user clicks a user in one of the summary tables
This is all very simple; I can build the report without any problem using a small subset of the large table.
But how do I make this work with the full table? Both DirectQuery and Import mode take an unacceptably long time to load,
because Power BI seems to load the entire 10M rows first and only then apply the year filter.
Hi @Anonymous
If the source is SQL Server, I would suggest adding a SQL query along the lines described in this post.
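A minimal sketch of what that could look like in Power Query (M), assuming a SQL Server source; the server, database, and table names (dbo.HoursWorked) are placeholders to adjust:

```m
let
    // Connect to the SQL Server source (names are placeholders)
    Source = Sql.Database("myserver", "mydb"),
    // Push the year filter down to the server instead of importing all rows
    Filtered = Value.NativeQuery(
        Source,
        "SELECT [User], HoursWorked, [Date], ProjectNumber, HourCode
         FROM dbo.HoursWorked
         WHERE [Date] >= '2023-01-01' AND [Date] < '2024-01-01'"
    )
in
    Filtered
```

With the filter in the native query, only the roughly 500K rows for that year cross the wire instead of all 10M.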
Regards,
Cherie
Yeah, that is what I do now, but it still pulls in the 10M rows. I can't pre-filter it, because the filter is user dependent:
the user chooses the period he wants to see, but that happens after the data has already been pulled in.
Isn't there a way to pre-filter the data based on a user's input?
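For what it's worth, DirectQuery models support dynamic M query parameters, which bind a slicer selection to an M parameter so the filter is applied at the source before any data is pulled. A sketch, assuming a SQL Server source, a hypothetical table dbo.HoursWorked, and an M parameter named SelectedYear bound to a year slicer:

```m
let
    // Connect to the SQL Server source (names are placeholders)
    Source = Sql.Database("myserver", "mydb"),
    // SelectedYear is an M parameter; in a DirectQuery model it can be
    // bound to a slicer field via "Bind to parameter" in Model view
    Filtered = Value.NativeQuery(
        Source,
        "SELECT [User], HoursWorked, [Date], ProjectNumber, HourCode
         FROM dbo.HoursWorked
         WHERE YEAR([Date]) = @year",
        [year = SelectedYear],
        [EnableFolding = true]
    )
in
    Filtered
```

When the user changes the year slicer, the parameter value is substituted into the query and only that year's rows are fetched.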
Hi, did you find a solution for this?