kpia
Helper I

Custom table visual max allowed size

Hi, 

I am developing a custom table visual to create a writeback solution. I am using DirectQuery with a Databricks SQL warehouse as my back-end. I keep running into this error: "The resultset of a query to external data source has exceeded the maximum allowed size of '1000000' rows."

 

I set the data reduction algorithm to 100 rows and still get this error. I load a number of columns from my fact table, but as soon as I add a related column from my dim table, I get this error.
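For context, the reduction is declared in my visual's capabilities.json roughly like this (the role name "values" is illustrative, not my actual mapping):

```json
{
  "dataViewMappings": [
    {
      "table": {
        "rows": {
          "select": [{ "for": { "in": "values" } }],
          "dataReductionAlgorithm": { "top": { "count": 100 } }
        }
      }
    }
  ]
}
```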

 

Any idea what can cause this?

2 REPLIES
kpia
Helper I

@dm-p Thanks for the response. I checked the query generated by the visual and the query that is run in Databricks.

 

The DAX TOPN query defines N as the count specified in the data reduction algorithm + 1, so in my case it's 101.
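To sketch the shape of it (table and column names here are placeholders, not my real model):

```dax
EVALUATE
TOPN(
    101,
    SUMMARIZECOLUMNS('Fact'[Column1], 'Dim'[Attribute])
)
```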

 

The strange thing is that the query run in Databricks has a limit of 1,000,001 rows no matter what my data reduction algorithm is set to, but when I remove the dimension field, the Databricks query shows the limit set to 100.
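So once the dimension column is added, the pushed-down query that reaches Databricks looks roughly like this (table and column names are placeholders; I assume the +1 is there so the engine can detect that the limit was exceeded):

```sql
SELECT f.column1, d.attribute
FROM fact AS f
JOIN dim AS d
  ON f.dim_key = d.dim_key
LIMIT 1000001 -- the 1M-row DirectQuery guardrail, +1 to detect overflow
```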

 

Additionally, what is strange is that the cardinality between the fact and dim tables is only 46, so it's really small.

 

The same limit is hit with the standard table visual, so I am guessing this is a feature and not an issue with my code.

dm-p
Super User

Hi @kpia,

 

The data reduction algorithm should apply a TOPN in the DAX query that Power BI generates, but it looks like adding the dimension causes an issue with either the query Power BI generates on your visual's behalf, or the query Power BI sends from the DAX to the back-end. It may need to fetch more information from the remote data source, and this could be hitting the row limit that DirectQuery has. As the data reduction algorithm is the only means you have to limit the outgoing query from a custom visual, this may be a "feature" of DirectQuery and/or of how Power BI accesses your data in this scenario.

 

The easiest thing to do would be to profile both the DAX generated by your visual (through Performance Analyzer) and the query being run against Databricks, and see what the resulting dataset looks like (I don't have much experience with that part to advise how to do it). If it is >1M rows, that suggests this is indeed a DirectQuery-related scenario. FWIW, if you can replicate this with the core table visual, that would reinforce the limitation, as that visual has its own data reduction algorithm and paginates additional queries after the first query has returned its data.





Did I answer your question? Mark my post as a solution!

Proud to be a Super User!


My course: Introduction to Developing Power BI Visuals


On how to ask a technical question, if you really want an answer (courtesy of SQLBI)



