In SSDT we can close the designer and apply changes without processing. Is there any way to do the same thing in Power BI Desktop?
Whenever we create something in the Query Editor, it expects us to reprocess the table, which is a pain every time if it is a big table.
Is Microsoft thinking in that direction, to bring this in sync with the SSDT process?
Hi, I stumbled upon this exact same issue myself, and was surprised to find that it isn't mentioned more often.
What I ended up with is not an exact solution to your problem, but I think it may serve as a decent workaround.
To limit the number of rows loaded and the local processing after you hit "Close and Apply" in Power BI Desktop, you can use parameters in your queries to filter the tables.
Let's say you have a log file as a source, with a datetime field and a value field and entries every second for 10 years. You don't want to wait for all those rows to load and process locally; you want to push it to the cloud and let the magic happen there!
In such a case you should be able to avoid this by creating a datetime parameter and then filtering the query by that parameter to limit the number of rows. Hit "Close and Apply", do some report building in a lightning-fast, row-light environment, and publish. Then, after the report is published, change the parameter from the Power BI service and refresh the dataset.
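A minimal Power Query (M) sketch of that filter step, assuming a datetime parameter named LoadAfter (created via Manage Parameters) and a source table with a Timestamp column — both names are just placeholders for your own:

```m
let
    // Placeholder source; swap in your actual log-file connection
    Source = Csv.Document(File.Contents("C:\logs\log.csv"), [Delimiter = ",", Encoding = 65001]),
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    Typed = Table.TransformColumnTypes(Promoted, {{"Timestamp", type datetime}, {"Value", type number}}),
    // While developing, set LoadAfter to something recent (e.g. yesterday)
    // so only a sliver of rows is loaded and processed locally
    Filtered = Table.SelectRows(Typed, each [Timestamp] >= LoadAfter)
in
    Filtered
```

After publishing, open the dataset's settings in the Power BI service, set LoadAfter back to an early date, and refresh so the full history loads in the cloud.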
You could probably even use an if-statement based on a parameter in your query to limit the processing even further and return an identical but empty table instead. Yeah, it shouldn't be necessary to do this, but until Microsoft creates the much-needed "one-click" solution, we will have to make do.
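A sketch of that if-statement idea, assuming a hypothetical true/false parameter named LoadData alongside the same placeholder source as above:

```m
let
    Source = Csv.Document(File.Contents("C:\logs\log.csv"), [Delimiter = ",", Encoding = 65001]),
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    Typed = Table.TransformColumnTypes(Promoted, {{"Timestamp", type datetime}, {"Value", type number}}),
    // LoadData is a hypothetical true/false parameter (Manage Parameters).
    // When false, Table.FirstN(Typed, 0) keeps the exact same column schema
    // but loads zero rows, so "Close and Apply" finishes almost instantly.
    Result = if LoadData then Typed else Table.FirstN(Typed, 0)
in
    Result
```

Because the empty branch preserves the schema, your report visuals and measures stay valid; you just flip the parameter to true in the service and refresh when you want real data.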
Information about editing parameters in the Power BI service can be found here: https://docs.microsoft.com/en-us/power-bi/service-parameters.
The Query Editor is a tool for shaping data, and the dataset is the source we use for analysis. If we don't want to apply the changes, why make them at all? Of course, we can simply close the editor without applying the changes.
I understand your point, but there can be situations where you have to add or modify something in the Query Editor but don't want to process it right away because it takes a lot of time. Instead, you want to process the entire model once all the changes are complete.
Especially considering that if we deploy the model to AAS, why would someone want to process it locally? Processing locally would be slower than on AAS.
I would suggest you create an idea here. Big models consume a lot of resources, which means the bill would be large if we did all the processing in AAS.
But when I publish it to AAS, I will have to enable it again, right? And in that case it will again expect me to process the table.
So you mean that during the development phase I need to disable this, and once development is done, enable it again and publish to the cloud.
For example, I have many queries, some quite large, from external sources, and I maintain a lookup table to bridge in some dummy customers that aren't yet in our main DB, which I append to the master data I pull in from our ODS server. If I want to update just that table, I go into the Query Editor and either delete or add the sort step at the end. When I close and apply, only the data sources that use this lookup table refresh, without updating the other ones.