I was curious: when a custom SQL query is in the 'evaluating' phase, is that the query folding taking place on the server?
And when the rows of this query are being imported and this goes slowly, is that then no longer a server problem, but rather an internet/gateway issue?
What folding does is try to translate your steps in Power BI into a native SQL query. You can right-click any step in your query, and if "View Native Query" is enabled, then query folding is taking place.
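For illustration, here is a minimal sketch of a query where that check applies; the server, database, table, and column names are made up, but the pattern is the standard one:

```
let
    // hypothetical server and database names
    Source = Sql.Database("myserver", "SalesDb"),
    Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data],
    // this step folds: right-clicking it shows "View Native Query" with
    // roughly SELECT ... FROM [dbo].[Orders] WHERE [Amount] > 100
    Filtered = Table.SelectRows(Orders, each [Amount] > 100)
in
    Filtered
```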
Hello @artemus,
Thank you for your reply. However, that was not the piece of information I was looking for.
What I was curious about is that there are several mashup steps within the query process. There is the query folding/extraction part (1) on the server of the source data. Then the data is imported into the ETL part of Power BI (2) and eventually loaded into the data model, where the DAX calculations take place (3).
For example, when you manage to create a large inner join that works with query folding, the evaluating step (part 1) sometimes takes 30-40 minutes before the data begins to import. One data source takes only 5 minutes to import the evaluated query, but another with the same amount of data takes an hour (part 2). Also, as my report expands in DAX expressions, I noticed my query being fully loaded but taking quite some time to actually 'disappear' (part 3).
When I look at the behavior of querying data, my assumption was that during the 'evaluating' step of the query, part 1 takes effect. Any slowness there could be poorly written SQL or a slow server. Once the data starts loading in, part 1 is done; any slowness at this step would be connection speed or gateway issues, I would say. Then when all queries are finally loaded and no more rows are being loaded into the data model, part 3 takes effect: the DAX calculations. Usually this part is very small.
I was curious whether the assumptions I make about the process above are correct. I was hoping I could get some insight into the query process by watching the query progress and seeing roughly how long each step takes. As far as I can tell, only part 2 has an analytical preview feature to analyze the effectiveness of a query.
Any chance you can shine some light on this?
Sure, just right-click the last step in your query and hit Diagnose. You can also enable logging in the options menu.
I've tried, but it appears my permissions are not sufficient to run the diagnostics.
But can you perhaps tell me whether my assumptions are correct?
Just run Power BI as administrator to get around this error.
As far as your assumptions go, you are mostly correct.
When a function returns a table (e.g. Sql.Database), it can call Table.View and pass it handlers to return a table view. Once in the view, many operations on the table will invoke the appropriate handler, which will attempt to convert the operation into a native query. RowExpression.From is used to decompose the functions passed into Table.AddColumn, Table.SelectRows, and Table.Group. If this succeeds, the function will usually return a new view to handle the next operation. If it fails, Power BI will attempt to retry one view level up. This can cause a major performance hit, and is something that is being fixed soon. If it fails at the highest level, it will take the lowest level that succeeded, import all of that data into the model, and then proceed to transform the data without folding.
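As a rough sketch of that mechanism (ExecuteSql is a hypothetical helper standing in for whatever actually runs the native query, and the schema is made up), a foldable view might look like:

```
// A rough sketch of a foldable view, not a complete connector.
View = (state as record) =>
    Table.View(null, [
        GetType = () => type table [Id = Int64.Type, Name = Text.Type],
        // run the native query accumulated so far
        GetRows = () => ExecuteSql(state[Query]),
        // fold Table.FirstN into the native query and hand back a new view
        OnTake = (count as number) =>
            @View(state & [Query = "SELECT TOP " & Text.From(count)
                & " * FROM (" & state[Query] & ") t"])
        // an operation with no handler raises an error here, and the engine
        // falls back one view level up, as described above
    ])
```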
For import mode, the above is done, then the final step of each query is run and loaded into the model (each query runs separately, completely independently of the others). This may result in some type loss, as the Power Query type system is richer than DAX's. Once in the SSAS model, DAX processing happens as expected.
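For example (a made-up table, just to show the kind of conversion you might do before load), Power Query's duration type has no direct equivalent in the model, so it can help to convert explicitly:

```
let
    // made-up source with a duration column
    Src = #table(type table [Task = text, Elapsed = duration],
                 {{"Build", #duration(0, 1, 30, 0)}}),
    // store hours as a plain number instead of relying on implicit conversion
    WithHours = Table.TransformColumns(Src, {{"Elapsed", Duration.TotalHours, type number}})
in
    WithHours
```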
For DirectQuery mode, only the schema is loaded into the model. When a user views the data, all filters are translated into Power Query and the data source function attempts to fold them. If this fails, then the DirectQuery fails. The data is then returned to the user. I believe there is a 10,000-row limit on the number of rows loaded per query into the model this way.
Thank you for your explanation.
My current model works by importing with a custom-written SQL query. From that point, all ETL processing happens in DAX. This way I have more control over what happens and can more easily change the logic.
Based on your explanation, I would say that the import stage would be the same, except that it would behave like a pre-made DirectQuery source without the limitations, as there is only one step. Is this conclusion drawn from your explanation correct?
If you use import, then the entire result of the query is loaded into your SSAS model every time you do a refresh. If this is your performance bottleneck, then the issue is how much data you are loading. Note that in the previewer, only the first 1,000 rows are loaded.
Note that the query has to determine the schema being loaded, so if you have a custom sproc, it might not be easy to determine this purely through static analysis.
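One way around that (server, database, procedure, and column names here are hypothetical) is to call the sproc through Value.NativeQuery and declare the column types yourself:

```
let
    // hypothetical server and database names
    Source = Sql.Database("myserver", "SalesDb"),
    // the engine cannot infer a sproc's schema statically,
    // so run it and assert the types explicitly
    Raw = Value.NativeQuery(Source, "EXEC dbo.usp_GetOrders", null),
    Typed = Table.TransformColumnTypes(Raw, {{"Id", Int64.Type}, {"Amount", Currency.Type}})
in
    Typed
```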
The average file of mine is about 300-400 MB. The largest was 3.7 GB, until I slashed it to 0.9 GB. It saves a lot of time developing reports when you don't have to run queries to see what you're working with. I have found plenty of... interesting cases in our system with this tactic.
Thanks for explaining the query steps Power BI makes. This will help me have a discussion with our IT about where the bottlenecks are. I'm afraid we're going to demand too much server capacity by pushing all our calculations onto the server, although the gateway could be an interesting topic as well.
Is direct query an option for you? It could save a lot of downloading.
Also you could look into incremental refresh to avoid redownloading all the data.
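Incremental refresh relies on the reserved RangeStart/RangeEnd datetime parameters; a sketch with made-up server, table, and column names:

```
let
    // hypothetical server and database names
    Source = Sql.Database("myserver", "SalesDb"),
    Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data],
    // RangeStart/RangeEnd are the parameters Power BI swaps in per partition;
    // this filter must fold so only the new slice is fetched from the server
    Filtered = Table.SelectRows(Orders,
        each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd)
in
    Filtered
```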
As far as my experience goes, DirectQuery is not ideal for what I'm trying to do. Calculated columns in DAX allow for more flexibility than DirectQuery does. Besides, I'm mainly looking for flaws in the system.
The one with the big numbers was compressed to 0.6 GB on Desktop, where incremental refresh would lift it up to 3.7 GB. Yeah, I have my fair share of experience with larger datasets. Now that Power BI is becoming more of a success, parts of IT are starting to feel the strain :). It's like having a car built for 100 mph. Then you put a large block into it, and the tires, the brakes, the steering and the structural integrity of the car begin to suffer. Power BI feels a bit like putting that large block into your car.
If you're set on loading a large amount of data, you could copy it to a blob and just download it from there (ideally not from your SQL server). That might take strain off your SQL server.
Not sure what a blob is in our context. Had a chat with IT yesterday and, funnily enough, Power BI direct queries were already causing major problems for the server :). They created a dummy server with a 5-minute delay for this purpose.
Just speaking hypothetically, it would be more ideal if dataflows could be used as an import source. That way the whole organization can benefit from data that is already behind the gateway, saving on gateway and server capacity. Speaking of which, do dataflows take Power BI RAM for their calculations? Because we're getting close to ours :).
Dataflows are basically an abstraction around SSAS that handles different kinds of imports and lets you use the Office 365 security model for controlling access. While dataflows are a good, feature-rich experience, you should also weigh them against using SSAS directly, as you do lose some of the finer controls on how the data is used. In particular, you can define measures in SSAS directly for everyone to use, while in dataflows you can import much more than just SQL data.
As far as blobs are concerned, the most performant option would be to have all the data sent to an Azure storage account located in the same region as your Premium capacity or SSAS server. But this really only works if you mostly append to your data set and don't do many updates.
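For reference, reading appended extracts from a storage account looks roughly like this (the account and container names are made up):

```
let
    // hypothetical storage account URL
    Source = AzureStorage.Blobs("https://myaccount.blob.core.windows.net"),
    // blobs in the "exports" container; each row has a binary Content column
    Files = Source{[Name = "exports"]}[Data],
    // parse each appended CSV extract and combine them into one table
    Parsed = Table.AddColumn(Files, "Csv", each Csv.Document([Content])),
    Combined = Table.Combine(Parsed[Csv])
in
    Combined
```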
Thank you for the information. I'll pass it on. Sounds like it's not a problem with a simple solution :).