Hello,
I have a dataset which is too large to be handled by PBI Desktop - it can be refreshed only through the gateway on the PBI Service side. My current deployment workflow looks like this:
1. In PBI Desktop I create a parameter which filters the dataset down to a smaller size (example: SELECT TOP x rows).
2. I do the development work in PBI Desktop.
3. I publish the dataset to the PBI Service.
4. I change the parameter so it no longer filters my data.
5. I refresh the dataset (a sketch of scripting steps 4 and 5 follows below).
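For what it's worth, steps 4 and 5 can be driven from outside the portal with the Power BI REST API (Default.UpdateParameters followed by a refresh). Below is a minimal Python sketch; the workspace/dataset IDs, the token, the parameter name "RowLimit", and the convention that "0" means "no filter" are all placeholders/assumptions you would replace with your own:

```python
import requests

# Placeholders - substitute your workspace ID, dataset ID, and an Azure AD
# access token (e.g. acquired with the msal library for the Power BI scope).
GROUP_ID = "<workspace-id>"
DATASET_ID = "<dataset-id>"
TOKEN = "<access-token>"

base = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/datasets/{DATASET_ID}"
headers = {"Authorization": f"Bearer {TOKEN}"}

# Step 4: set the parameter so it no longer filters the data.
# "RowLimit" and the value "0" are assumptions - use your own parameter's name/convention.
r = requests.post(
    f"{base}/Default.UpdateParameters",
    headers=headers,
    json={"updateDetails": [{"name": "RowLimit", "newValue": "0"}]},
)
r.raise_for_status()

# Step 5: trigger a refresh; the service runs it through the gateway.
r = requests.post(f"{base}/refreshes", headers=headers, json={"notifyOption": "MailOnFailure"})
r.raise_for_status()
```

Both calls need appropriate permissions on the dataset, and UpdateParameters only works for datasets published in the enhanced metadata format.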
And now, if I want to make a change in the model (for example, change the definition of a measure), I need to go through all of these steps again, which means that between steps 3 and 5 my reports show incorrect values.
Is there any way to update the definition of a DAX model (for example, add a measure) without overwriting all the data?
Thanks, Kamil
I believe you can update the model using Power BI Desktop, then use the ALM Toolkit to deploy the metadata differences (except for the parameter you changed) to the service. Take a look at: https://datamartin.ca/2022/09/29/metadata-only-deployment-using-alm-toolkit/
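If you ever want to script that kind of metadata-only change rather than clicking through the ALM Toolkit, the same XMLA endpoint accepts Tabular Object Model (TOM) operations. Here is a rough Python sketch using pythonnet and the Microsoft.AnalysisServices.Tabular client library - the workspace, dataset, table, and measure names are placeholders, and the AMO/TOM client libraries must be installed locally:

```python
import clr

# Load the Tabular Object Model client library (install the AMO/TOM client
# libraries so this assembly can be resolved by pythonnet).
clr.AddReference("Microsoft.AnalysisServices.Tabular")
from Microsoft.AnalysisServices.Tabular import Server, Measure

server = Server()
# XMLA endpoint of the workspace; authentication details omitted here
# (connecting typically triggers an interactive Azure AD sign-in).
server.Connect("Data Source=powerbi://api.powerbi.com/v1.0/myorg/MyWorkspace")

db = server.Databases.GetByName("MyDataset")   # the published dataset
table = db.Model.Tables["Sales"]               # table that will host the new measure

# Add a measure definition only - no data is read or refreshed by this change.
m = Measure()
m.Name = "Total Amount"
m.Expression = "SUM ( Sales[Amount] )"
table.Measures.Add(m)

db.Model.SaveChanges()   # pushes just the metadata change to the service
server.Disconnect()
```

Like the ALM Toolkit, this route relies on the XMLA endpoint, so it needs a Premium (or Premium Per User) workspace with XMLA read-write enabled.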
Hopefully this helps,
Blopez, thank you for that - it is indeed the solution to my problem. The only thing worth mentioning is that the ALM Toolkit uses the XMLA endpoint, which is a Premium feature.