In the Power BI service, I am getting this error for a table using DirectQuery: "The query result is too large. Consider removing unused columns, adding visual filters, or reduce the number of string type columns." I only have about 20,000 rows, so can someone clarify what the limit is so I can try to tune my table accordingly?
Hi @lgroger

You can optimize performance in the following ways:

1. Optimize the relational database source.
2. Optimize the DirectQuery model design, as described in https://docs.microsoft.com/en-us/power-bi/guidance/directquery-model-guidance#optimize-model-design
3. Optimize the report design.
4. Combine the benefits of Import and DirectQuery models in a single model by configuring the storage mode of the model tables. The table storage mode can be Import, DirectQuery, or both (known as Dual). When a model contains tables with different storage modes, it is known as a Composite model. For more information, see Use composite models in Power BI Desktop.
There are many functional and performance enhancements that can be achieved by converting a DirectQuery model to a Composite model. A Composite model can integrate more than one DirectQuery source, and it can also include aggregations. Aggregation tables can be added to DirectQuery tables to import a summarized representation of the table. They can achieve dramatic performance enhancements when visuals query higher-level aggregates. For more information, see Aggregations in Power BI Desktop.
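To make the aggregation-table idea concrete, here is a minimal sketch (outside Power BI, with hypothetical sample data) of what an aggregation table does: detail rows are collapsed to one row per grouping key, so a visual that only needs summarized figures touches far fewer rows than the DirectQuery detail table.

```python
# Hypothetical sales rows standing in for a DirectQuery detail table.
detail = [
    {"region": "East", "year": 2023, "amount": 100.0},
    {"region": "East", "year": 2023, "amount": 250.0},
    {"region": "West", "year": 2023, "amount": 80.0},
    {"region": "West", "year": 2024, "amount": 120.0},
    {"region": "West", "year": 2024, "amount": 60.0},
]

# Build an aggregation table: one row per (region, year) group,
# mirroring the summarized representation an Import-mode
# aggregation table provides for higher-level visuals.
totals = {}
for row in detail:
    key = (row["region"], row["year"])
    totals[key] = totals.get(key, 0.0) + row["amount"]

agg = [{"region": r, "year": y, "amount": a} for (r, y), a in totals.items()]

# Five detail rows collapse to three summary rows; at real data
# volumes this reduction is what makes aggregate visuals fast.
print(agg)
```

In Power BI itself you would configure this with the Manage aggregations dialog rather than writing code; the sketch only illustrates the shape of the data reduction.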
Alternatively, you can use Performance Analyzer to examine report element performance (https://docs.microsoft.com/en-us/power-bi/desktop-performance-analyzer) or DAX Studio: https://community.powerbi.com/t5/Desktop/VAR-and-FILTERS-combination-to-make-an-efficient-column/m-p...
I saw that post, but I couldn't find anything in the linked article saying that the maximum number of data points a table will display is 3,500. I'm able to show some tables with 20,000 rows successfully, so the limit has to be more than 3,500 cells.