Hi All,
Like in SSDT, where we have Close and Apply without processing, is there any way to do the same thing in Power BI Desktop?
Whenever we create something in Edit Queries, it expects us to reprocess the table, which is a pain every time for a big table.
Is Microsoft thinking in that direction, to bring this in sync with the SSDT process?
Regards,
Akash
Hi, I stumbled upon this exact same issue myself, and was surprised to find it isn't mentioned more often.
What I ended up with is not an exact solution to your problem, but I think it may function as a decent workaround.
To limit the number of rows loaded and the local processing after you hit "Close and apply" in PBID, you can use parameters in your queries to filter the tables.
Let's say you have a log file as a source, with a datetime field and a value field, with entries every second for 10 years. You don't want to wait for all those rows to load and process locally - you want to push it to the cloud and let the magic happen there!
In such a case you should be able to avoid this by creating a datetime parameter, and then filtering the query by this parameter to limit the number of rows. Hit "Close and Apply", do some report building - in a lightning-fast, low-row environment - and publish. Then - after the report is published - change the parameter from the Power BI service and refresh the dataset.
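For illustration, here is a minimal Power Query (M) sketch of that approach. The parameter name (LoadFromDate), the file path, and the column names are made up for the example; the parameter itself would be created via Manage Parameters in the Query Editor:

```
// Assumes a datetime parameter named LoadFromDate (hypothetical),
// set to a recent date while developing.
let
    Source = Csv.Document(File.Contents("C:\logs\events.csv"), [Delimiter = ","]),
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    Typed = Table.TransformColumnTypes(Promoted,
        {{"Timestamp", type datetime}, {"Value", type number}}),
    // Only rows on or after the parameter value are loaded locally;
    // widen the cutoff from the service after publishing.
    Filtered = Table.SelectRows(Typed, each [Timestamp] >= LoadFromDate)
in
    Filtered
```

After publishing, edit LoadFromDate under the dataset's settings in the service and refresh to pull in the full history.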
You could probably even use an if-statement based on the parameter in your query to limit the query processing even further and load an identical but empty table instead. Yeah, it shouldn't have been necessary to do this, but until Microsoft creates the much-needed "one-click" solution, we will have to make do.
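A rough sketch of that if-statement idea, again in M; the boolean parameter LoadData and the source names are placeholders, not anything from the original post:

```
// Assumes a true/false parameter named LoadData (hypothetical).
// When false, the query keeps its schema but returns zero rows,
// so "Close & Apply" has almost nothing to process locally.
let
    Source = Sql.Database("myserver", "mydb"),
    Log = Source{[Schema = "dbo", Item = "EventLog"]}[Data],
    Result = if LoadData then Log else Table.FirstN(Log, 0)
in
    Result
```

Flip LoadData to true from the service after publishing and refresh the dataset to load the real data.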
Information about editing parameters in the Power BI service can be found here: https://docs.microsoft.com/en-us/power-bi/service-parameters.
Hi @akj2784,
The Query Editor is a tool for shaping data, and the dataset is the source we use for analysis. If we don't want to apply the changes, why would we make them? Surely, we can simply close the editor without applying the changes.
Best Regards,
Dale
Hi Dale,
I understand your point, but there can be a situation where you have to add or modify something in Edit Queries but don't want to process it, as that takes a lot of time. Instead, you want to process the entire model once, after all the changes are completed.
Especially considering that if we deploy the model to AAS, why would anyone want to process it locally?
Processing locally would be slower compared to AAS.
Regards,
Akash
Hi Akash,
I would suggest you create an idea here. Big models consume a lot of resources, which means the bill would be large if we did all the processing in AAS.
Best Regards,
Dale
But when I publish it to AAS, I will have to enable it again, right? And in that case it will again expect me to process the table.
So you mean during the development phase I need to disable this, and once development is done, enable it again and publish to the cloud?
For example, I have many queries, some quite large, from external sources, and I maintain a lookup table to bridge in some dummy customers that aren't yet in our main DB, which I append to the master data I pull from our ODS server. If I want to update just that table, I go into the Query Editor and either delete or add a sort step at the end. When I close and apply, that refreshes only the data sources that use this lookup table, without updating the other ones.
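Roughly what that "toggle the last step" trick looks like in M; the server, file path, and column names here are placeholders, not the poster's actual setup:

```
// Sketch of the workaround: adding or deleting the final Table.Sort step
// marks this query as changed, so on "Close & Apply" only the queries
// that use it are re-evaluated, not the whole model.
let
    Ods = Sql.Database("odsserver", "ods"),
    Master = Ods{[Schema = "dbo", Item = "Customers"]}[Data],
    Dummy = Excel.Workbook(File.Contents("C:\data\dummy_customers.xlsx"))
                {[Item = "DummyCustomers", Kind = "Table"]}[Data],
    Combined = Table.Combine({Master, Dummy}),
    // Delete this step (or add it back) to force a refresh of just this query:
    Sorted = Table.Sort(Combined, {{"CustomerId", Order.Ascending}})
in
    Sorted
```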