I have a dataset with around 500,000 to 600,000 records. Whenever I filter any record in the dataset, it takes a huge amount of time to apply the filter.
Is there any way to minimise the processing time?
Any help?
First you need to figure out whether the time is being spent retrieving the data from the source database or rendering the report. You can find this out by looking at the ExecutionLog3 view in the ReportServer database.
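As a starting point, a query along these lines against the ReportServer database will break recent executions down into their retrieval, processing, and rendering components (a minimal sketch; the column names below are from the standard ExecutionLog3 view, and all times are reported in milliseconds):

```sql
-- Most recent report executions, with time broken down by phase
SELECT TOP 20
    ItemPath,            -- which report ran
    TimeStart,
    TimeDataRetrieval,   -- ms spent fetching data from the source
    TimeProcessing,      -- ms spent processing the report
    TimeRendering,       -- ms spent rendering the output
    [RowCount]           -- rows returned by the dataset queries
FROM dbo.ExecutionLog3
ORDER BY TimeStart DESC;
```

Whichever of the three Time columns dominates tells you where to focus your tuning effort.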
If most of the time is spent on data retrieval, then you need to look at tuning your queries. If most of the time is spent on rendering, then you need to either simplify the layout and formatting or reduce the number of rows. That is a very large number of rows, and it sounds more like a data extract than a report someone would read through line by line. If what you are producing really is a data extract, you may be better off using a different approach than a report.