VanThuan
Frequent Visitor

The resultset of a query to external data source has exceeded the maximum allowed size of 1000000

Hello,

I have a fact table set to Import mode, and most of the dimension tables are also in Import mode, except for one dimension table containing sensitive data, which is in DirectQuery mode.

When I create a visual that includes a field from the DirectQuery dimension table and a measure from the fact table (Import mode), I encounter the following error:

"The resultset of a query to external data source has exceeded the maximum allowed size of ‘1000000’."

Do you have any suggestions or solutions to resolve this issue?

Thank you!

2 ACCEPTED SOLUTIONS
v-veshwara-msft
Community Support

Hi @VanThuan ,

Thanks for posting in Microsoft Fabric Community.

DirectQuery has a fixed limit of 1,000,000 rows per query. This limit usually has no practical impact, but the error can surface when Power BI requests an intermediate result that exceeds it, such as while building a visual before filters are applied. The error returned is: "The resultset of a query to external data source has exceeded the maximum allowed size of '1000000' rows."

Reference: DirectQuery limitations 

 

As @MasonMA mentioned, adjusting the msmdsrv.ini configuration or pre-aggregating tables (the more practical option) are possible approaches. You can also limit the rows returned by applying filters or slicers, optimize DAX measures to reduce intermediate results, or test with smaller subsets.
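For reference, here is a minimal sketch of the msmdsrv.ini change, assuming the MaxIntermediateRowsetSize property documented in the Analysis Services DAX properties page linked below; the exact element path and the 5,000,000 value are illustrative, so verify both against your own msmdsrv.ini before editing:

<ConfigurationSettings>
  <DAX>
    <DQ>
      <!-- Default is 1000000; 5000000 is only an example value -->
      <MaxIntermediateRowsetSize>5000000</MaxIntermediateRowsetSize>
    </DQ>
  </DAX>
</ConfigurationSettings>

Raising the limit trades memory and query time for headroom, and Power BI Desktop may regenerate its workspace msmdsrv.ini between sessions, so prefer the modeling fixes where possible.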

 

You might find these resources useful:

Analysis Services DAX properties | Microsoft Learn

How to configure workloads in Power BI Premium - Microsoft Fabric | Microsoft Learn

 

Hope this helps. Please reach out for further assistance.

Thank you.


Poojara_D12
Super User

Hi @VanThuan 

What’s happening here is a classic mixed-mode scenario in Power BI: most of your model is in Import mode, but a sensitive dimension table is in DirectQuery. When you build a visual that combines fields from the DirectQuery dimension with measures from the Import-mode fact, Power BI is forced into a composite-model join: the engine pushes a query to the DirectQuery source to fetch the matching rows, but because it needs to bring that result set into memory to join it with the Import data, it can easily exceed the 1,000,000-row limit imposed on DirectQuery intermediate results. That’s why you see this error. To address it, you can:

1) Filter or aggregate the DirectQuery table as much as possible before combining, so fewer rows flow into the join.
2) Redesign the model so that the sensitive dimension doesn’t sit in DirectQuery but instead in Import mode with Row-Level Security (RLS) applied for protection.
3) Create an aggregated or masked version of the sensitive table in Import mode for reporting, while keeping the full table in DirectQuery only for drill-throughs (a sketch of such a masked view follows below).
4) If you must keep DirectQuery, limit visuals to only the necessary fields and avoid cross-filtering against large Import tables.

Essentially, the fix is to reduce the DirectQuery result set size or change the modeling approach so that Power BI isn’t forced to materialize such a large intermediate join.
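To make option 3 concrete, here is a minimal sketch of a sanitized source-side view, assuming a hypothetical dbo.DimCustomer table with CustomerKey, CustomerName, Email, and NationalID columns (all names are illustrative, not from the original post):

-- Sketch only: a sanitized copy of a hypothetical sensitive dimension.
-- Keeps the surrogate key so it still joins to the Import-mode fact,
-- but masks or drops the sensitive columns for everyday reporting.
CREATE VIEW dbo.vDimCustomer_Reporting AS
SELECT
    CustomerKey,                                   -- join key, preserved
    LEFT(CustomerName, 1) + '***' AS CustomerName, -- crude display mask
    CASE
        WHEN Email IS NULL THEN NULL
        ELSE '***@' + SUBSTRING(Email, CHARINDEX('@', Email) + 1, 128)
    END AS EmailDomainOnly                         -- keep only the domain
    -- NationalID intentionally omitted
FROM dbo.DimCustomer;

You would load this view in Import mode and relate it to the fact table, keeping dbo.DimCustomer itself off everyday report pages.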

 

Here’s a practical step-by-step modeling approach you can use to protect the sensitive dimension while avoiding the DirectQuery intermediate row limit:

1) Audit reporting needs: list exactly which columns and grain your regular reports require (often you don’t need full PII or row-level detail for everyday visuals).
2) Create a sanitized reporting copy at the source or in your ETL (a SQL view or ETL job) that removes or masks sensitive columns (or aggregates them to the required grain) but preserves the dimension key; load that sanitized table into Power BI in Import mode and join it to your Import fact table. This keeps normal visuals as fast Import-to-Import joins and prevents huge DirectQuery result sets.
3) Keep the full sensitive table only in DirectQuery for controlled use (do not use it on everyday pages): restrict its usage to admin/secure reports or drill-through scenarios, and lock those reports and workspaces down to small AD groups.
4) Provide secure detail paths: when users truly need the raw sensitive rows, give them a dedicated detail report (or paginated report) that uses DirectQuery only and sits in a tightly permissioned workspace, or implement parameterized drill-throughs that pass a single ID so the DirectQuery source returns a tiny, focused result set.
5) Use aggregations and incremental refresh: publish pre-aggregated fact tables (month/quarter) and configure aggregation mappings so most visuals are answered from Import aggregates; enable incremental refresh on Import tables so the sanitized copy stays current without heavy loads.
6) Push security to the source where possible: implement source-side RLS or dynamic data masking (or views that enforce predicates) so only authorized identities can see full sensitive values, reducing risk even if the DirectQuery surface is available (see the sketch after this list).
7) Test and monitor: use Performance Analyzer or SQL traces to confirm the DirectQuery queries stay small, and tune filters or views if any visual still generates large intermediate result sets.
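For step 6, here is a minimal sketch of source-side dynamic data masking, assuming SQL Server or Azure SQL Database and the same hypothetical dbo.DimCustomer columns as above; the role name is also hypothetical:

-- Sketch only: dynamic data masking on a hypothetical table.
-- Non-privileged users see masked values; trusted roles can be
-- granted UNMASK to see the real data.
ALTER TABLE dbo.DimCustomer
    ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');

ALTER TABLE dbo.DimCustomer
    ALTER COLUMN NationalID ADD MASKED WITH (FUNCTION = 'partial(0,"XXX-XX-",4)');

-- Hypothetical role; grant unmasked access only where audited.
GRANT UNMASK TO SecureReportReaders;

Note that the identity on the DirectQuery connection determines what Power BI sees, so pair this with single sign-on or a restricted service account as appropriate.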

This pattern keeps everyday reporting performant (avoids the 1,000,000 row DirectQuery spill) while preserving access to full data for only secure, audited workflows. If you want, I can draft an example masked SQL view and the Import/DirectQuery pattern (plus a sample drill-through parameter) you can adapt to your environment.

 

Did I answer your question? Mark my post as a solution; this will help others!
If my response(s) assisted you in any way, don't forget to drop me a "Kudos"

Kind Regards,
Poojara - Proud to be a Super User
Data Analyst | MSBI Developer | Power BI Consultant
Consider subscribing to my YouTube channel for beginner/advanced concepts: https://youtube.com/@biconcepts?si=04iw9SYI2HN80HKS


REPLIES
v-veshwara-msft
Community Support

Hi @VanThuan ,

We wanted to kindly follow up regarding your query. If you need any further assistance, please reach out.
Thank you.

v-veshwara-msft
Community Support

Hi @VanThuan ,

Just checking in to see if your query is resolved and if any responses were helpful.
Otherwise, feel free to reach out for further assistance.

Thank you.


v-veshwara-msft
Community Support

Hi @VanThuan ,
Just wanted to check if the responses provided were helpful. If further assistance is needed, please reach out.
Thank you.

audreygerred
Super User

Could you provide some background as to why the table with sensitive data is in DirectQuery?

Did I answer your question? Mark my post as a solution!

Proud to be a Super User!

MasonMA
Community Champion

@VanThuan 

 

SQLBI published an article years ago with a solution that involves editing the msmdsrv.ini configuration file.

https://www.sqlbi.com/articles/tuning-query-limits-for-directquery/

 

If the above configuration doesn't help, I also found this post from Ritaf1983 that walks through building pre-aggregated tables in Power Query, which may be more practical.

https://community.fabric.microsoft.com/t5/Desktop/The-resultset-of-a-query-to-external-data-source-h...
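The linked post builds the aggregates in Power Query; if you can push the work to the source instead, a minimal sketch of an equivalent pre-aggregated view, using hypothetical fact and column names, might look like this:

-- Sketch only: pre-aggregating a hypothetical fact to month/customer
-- grain so visuals read a small table instead of a wide row-by-row scan.
CREATE VIEW dbo.vFactSales_MonthlyByCustomer AS
SELECT
    CustomerKey,
    DATEFROMPARTS(YEAR(OrderDate), MONTH(OrderDate), 1) AS MonthStart,
    SUM(SalesAmount) AS SalesAmount,
    COUNT(*)         AS OrderCount
FROM dbo.FactSales
GROUP BY
    CustomerKey,
    DATEFROMPARTS(YEAR(OrderDate), MONTH(OrderDate), 1);

Imported at this grain, most visuals never need to touch the detail rows.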
