Hit Reply and let us know what you think of the DirectQuery for Power BI datasets and Azure Analysis Services. To learn more about this feature, please visit this blog post or our documentation.
Here are some areas that we'd like to hear about in particular:
Thanks and we look forward to hearing your feedback!
- The Power BI Modeling Team
This is an excellent feature that is very useful for us. But we are not able to change existing Power BI datasets in the Service into a composite model. Do we need to create a new dataset every time? We have many reports embedded in our site and don't want any impact on the report IDs.
this feature is built to allow you to extend / enrich existing datasets, which turns them into a composite model. Not sure why you would want to turn an existing dataset into a composite if you are not changing / extending it. If you add extensions / enrichment you will be creating a new dataset and can leverage an API to "rewire" your reports if you were so inclined. https://learn.microsoft.com/en-us/rest/api/power-bi/datasets/update-datasources
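As a minimal sketch of that "rewiring" step (not an official snippet): alongside the Update Datasources API linked above, the Power BI REST API also has a Reports - Rebind Report endpoint that points an existing report at a different dataset while keeping the report's own ID. The GUIDs and token below are placeholders, and the request is only built here, not sent:

```python
import json
from urllib import request

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def build_rebind_request(report_id: str, new_dataset_id: str, token: str) -> request.Request:
    """Build the Rebind Report call that repoints an existing report
    (keeping its report ID) at a different dataset."""
    return request.Request(
        url=f"{API_ROOT}/reports/{report_id}/Rebind",
        data=json.dumps({"datasetId": new_dataset_id}).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",   # AAD access token (placeholder)
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Placeholder IDs; in practice these are the workspace artefact GUIDs.
req = build_rebind_request("report-guid", "dataset-guid", "<aad-token>")
print(req.full_url)  # -> https://api.powerbi.com/v1.0/myorg/reports/report-guid/Rebind
```

To actually send it you would call `request.urlopen(req)` with a valid token; looping this over your embedded reports is how you would move them all to the new composite dataset.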
Not sure if I explained it right. I have an existing dataset to which I would add a calculated table. After adding the calculated table, when I publish the dataset it is published as a new dataset rather than overwriting the already available dataset, even though the names are the same. If this is expected behaviour, will rewiring the reports make the report ID change?
Hi @Anonymoous11, we work with PBI datasets on a daily basis, updating and expanding upon them: for instance by adding a calculated table to an existing PBI dataset. This way of working is also called the 'Golden Dataset' principle, and it will not change the IDs of the dataset or linked reports.
If you are working in the original file (.pbix) of the dataset, and publish this accordingly, the expected behaviour is that the dataset in the PBI Service will be updated (you will also see a warning in PBI Desktop when publishing notifying you of this).
If the above does not help, could you provide more information on the steps you are actually performing when extending upon an existing dataset?
I have already posted a thread with the issue I am facing. Hope this explains my scenario better.
Please let me know if you find a solution to the above-mentioned problem.
I responded to the link above - basically it's what I said: this feature allows you to extend / enrich a dataset and publish it as a new artefact. It's not the intent of this feature to overwrite the existing dataset.
For us, this feature is very promising and we have taken steps to implement it within our enterprise (5000+ employees with Pro licenses). However, we ran into the limitation that all viewers of reports developed using this feature must have Build permissions on all underlying datasets, and this is very troubling for us. Giving out Build permissions to employees who (based on their function and/or data-literacy level) only need to view reports poses a high risk for us.
Currently we are urgently looking for a workaround or alternative solution to offer customizable managed self-service BI to employees with Power BI within our enterprise, as advocated in this Microsoft blog. Is there any update on the limitation that Build rights are always needed with the "DirectQuery for Power BI datasets and Azure Analysis Services" feature?
This is the way it used to be for us, but they did change it to only require Viewer permission a little while ago. Maybe this is only for Premium licensing? We have Premium. But we don't have to give them Build access anymore to view a report that is using DirectQuery for Power BI datasets. Now, if they want to Analyze in Excel and build off the dataset, then they do need Build access.
Hi @jeroenterheerdt and @mamsteroonie, thank you both for your information! We are indeed aware that within Premium/PPU licencing this issue is resolved. However, we invested majorly in taking our full organisation to the next data-driven level by providing Power BI Pro licenses for all (5000+) employees only one year ago, so for us currently moving on to Premium/PPU licencing is not an option.
@jeroenterheerdt I feel very optimistic hearing that Microsoft are actively working on removing the Build limitation for Pro users. This could mean that simply 'wait and work with the current limitation' might be a valid option for right now. Could you provide any kind of timeline for this limitation to be fully dealt with for Pro users? Could we expect this within the current calendar year (2023), perhaps?
Unfortunately I cannot say any such thing in this public forum, but we are working on a blog post that should shed some light on what you're asking. (By the way, congrats on the 500th message in this thread!)
It has come to our attention that this issue has been solved! I have not found any official Microsoft communication regarding this, but we discovered it ourselves and found a blog post by A Guy In A Cube confirming this (start at 5:23):
@jeroenterheerdt Would you happen to know if there is any documentation/release notes available somewhere on this?
this has indeed not been resolved for all workspaces / customers. We are still working on it. We expect more news in April.
I made a connection with DirectQuery for Power BI datasets.
It's fantastic, very useful and powerful.
But, and it's a big BUT, once the connection is made, there is no way to remove it!
Please, could you solve that?
Hi @florenti, thank you very much for your feedback and for trying this out! I am glad you like it! Regarding your question, I assume you have been looking to find the connection in the Power Query / Data Transformation window. It is by design that you will not find the connection there, as written in our documentation. However, you will be able to delete the connection using the Data Source connection dialog (Transform Data --> Data Source settings), in which you can also update the connection info if required.
Please let me know if this helps or if you are seeing anything else.
Exactly my question! This was first published as the biggest thing in the Power BI world at the time, but now I hear nothing about this and there is not even a best guess for GA. It is annoying to know that this feature is available, but only in theory. There were so many bugs when testing this that I am waiting for GA to continue any work with it.
Five days from the two-year anniversary of "a milestone in business intelligence."
I doubt that the mainstream media will recognize the importance of such innovation. But users that create and consume reports every day will immediately realize the impact of this change.
It is here, and it just works.
I would like to chime in by saying I support Microsoft in not making a feature GA until it is truly ready for prime time. There have been bugs, yes, but to their credit they have not released to GA prematurely and they are throwing fixes at the bugs quite quickly.
Secondly, this is a very difficult thing to implement technically.
The feature looks to be largely implemented via extensions to the DAX language itself, and it implements relationships between the local and remote models by placing large lists of data inside DAX statements as dynamic sets.
Finally, if you are an enterprise model designer, it's important to understand how this feature works under the hood to help you decide whether or not you should use it in your particular scenario, taking into account its features and limitations, and other factors such as the size of your model. In some cases, extending a single base model is going to provide a far superior solution to composite modelling.
There is a setting DiscourageCompositeModels that can be used to control whether composite modelling is permitted at a dataset level.
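For anyone who wants to flip that switch programmatically, one option is a TMSL `alter` command sent over the XMLA endpoint (e.g. from SSMS or Tabular Editor). The sketch below only builds the JSON payload; the database name `SalesModel` is a made-up example, and the exact TMSL shape should be double-checked against the documentation:

```python
import json

def discourage_composite_models_tmsl(database_name: str, discourage: bool = True) -> str:
    """Build a TMSL 'alter' command that sets the model-level
    discourageCompositeModels property on the named database."""
    command = {
        "alter": {
            # Identify which object the alter applies to.
            "object": {"database": database_name, "model": "Model"},
            # The property being changed on that model.
            "model": {"discourageCompositeModels": discourage},
        }
    }
    return json.dumps(command, indent=2)

print(discourage_composite_models_tmsl("SalesModel"))
```

The same property is also exposed in the Tabular Object Model (`Model.DiscourageCompositeModels`) for those scripting against the model from .NET instead of sending raw TMSL.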
thanks all, we are not quite ready yet for prime time (or making this generally available). Yes, it takes a long time (longer than any of us want) but we are dependent on infrastructure changes that are taking longer than expected before we can declare GA.