Hit Reply to tell us what you think about the new Data Model Editing in the Power BI Service feature so we can continue to improve.
Thanks,
-Power BI team
To read more about the feature see the announcement in the Power BI Product Blog
I am glad to hear that you are excited to try this out! I suspect that the dataset you are trying to test falls under one of our limitations, which is why the 'Open data model' button is disabled. You can see the full list of limitations for this experience in our documentation: Edit data models in the Power BI service (preview) - Power BI | Microsoft Learn
To see which limitation applies to your dataset, do the following:
Hover over the Open data model button on the dataset details page. A tooltip will indicate which limitation is causing the Open data model button to be disabled.
Thank you for sharing this feedback! This is really helpful as we plan and prioritize improvements to this experience.
As I mentioned before, and as I read GregW's request: we want to be able to open the data model, make multiple changes, and then Save or Discard those changes, just like in 99% of Microsoft's applications and forms. If you insist on auto-saving as is, then at least provide undo functionality.
Thank you for providing this feedback! With an auto-save experience, would you be more concerned about undoing changes made in the same editing session or across several editing sessions?
It would be great if Git integration could be used for this. Allow a PR to be staged from the changes, and let the user choose whether or not to push.
Sadly for our org we have multiple tenants involved and can't use git integration for workspaces until enhancements are made to better support complex enterprise orgs.
Thank you for this feedback, this is helpful to know what scenarios are most important to users like you! What enhancements are needed to git integration for workspaces in order for your org to adopt using this capability?
My ADO and Power BI are in different tenants. I cannot sync a workspace in my Premium capacity to a Git repo tied to an ADO project in a different tenant. Based on the documentation and the testing I have done, the Git repo and the Power BI capacity currently have to be in the same tenant. (I created a new ADO org in the same tenant as the Power BI capacity and it worked, but we want our code in one place and need to support Power BI capacities in multiple tenants, and creating pipelines to clone an ADO repo to a different tenant seems hacky at best.) Large enterprises often end up with multiple tenants for a multitude of reasons, and while I get it for an initial release, it would be really nice to support some of these complexities that come with scale.
The solution that comes to mind is allowing a workspace to sync with a Git repo specified by URL instead of the prepopulated drop-down list.
This feature has significantly reduced the time I spend making modifications to my main dataset file. I have one Power BI dataset that houses all data transformations, calculated columns, and measures, and this one file fuels 20+ reports. It is REALLY nice to make formatting changes right there in the service and have my in-progress report reflect that change within 5 minutes. It is also convenient to be able to view or modify my original formulas without having a massive dataset file open on my PC and then saving and uploading it to see the result.
This change has moved me away from relying on hard copies of files; I feel confident that I can download a refreshed version of my dataset every morning and have all the small changes anyone made in the service the prior day.
Like some others, I still like to maintain my own version history to keep a record of large changes and be able to quickly revert them if necessary, since there is no way to test how the service reports will react to major changes aside from setting the whole thing live and keeping a pre-change version ready to deploy in case certain visuals stop functioning. If Power BI could store a version history of published datasets, similar to SharePoint files, this would be very helpful. Add a version notation feature too, and my life would change.
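Until built-in version history exists, a scheduled download can approximate the backup routine described above. Below is a minimal sketch, assuming the Power BI REST API's "Export Report In Group" endpoint and an access token you acquire separately (e.g. with MSAL); the workspace ID and report ID are placeholders you would fill in for your own tenant.

```python
# Sketch: date-stamped .pbix backups via the Power BI REST API.
# The group (workspace) ID, report ID, and bearer token are
# placeholders for your environment; error handling is omitted.
import datetime
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"

def export_url(group_id: str, report_id: str) -> str:
    """Build the 'Reports - Export Report In Group' endpoint URL."""
    return f"{API}/groups/{group_id}/reports/{report_id}/Export"

def backup_pbix(group_id: str, report_id: str, token: str, out_dir: str = ".") -> str:
    """Download the current .pbix and save it with today's date in the name."""
    stamp = datetime.date.today().isoformat()
    path = f"{out_dir}/dataset-{stamp}.pbix"
    req = urllib.request.Request(
        export_url(group_id, report_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp, open(path, "wb") as f:
        f.write(resp.read())
    return path

if __name__ == "__main__":
    # token = ...  # acquire via MSAL or another Azure AD flow
    print(export_url("<workspace-id>", "<report-id>"))
```

Run daily from a scheduler, this leaves you a date-stamped .pbix per day that you can republish if a change in the service breaks a visual.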
We appreciate you taking the time to provide feedback! It brings us great joy to learn that the feature has considerably reduced the time you spend modifying your dataset. We hope it continues to save you time in the future as well! Additionally, thank you for sharing your thoughts on the need for version history for published datasets. Your feedback was very insightful as we are investigating options to help support these types of scenarios in the future. Please continue to let us know any additional feedback you have as you continue to use this feature!
Do you have any guidance on how we can merge changes made in the service back into our local .pbix file without having to redownload the entire .pbix from the service? Like Kgirgenti1795, we have one or two large datasets as .pbix files, and downloading them takes a long time. If we could just merge the changes and then check our local copy back into source control, that would save us a lot of time.
Thank you!
I would recommend trying Git integration with the Power BI Project file type (PBIP) to accomplish these source control needs. More information can be found here in our documentation. Please let us know if you have any feedback on Power BI Desktop projects Git integration!
I still have to test this out some more, but I like that the ability to edit the data model online is there now. I am the only one at my company who works on Power BI reports and models. After a report review, or just randomly, I get requests that require me to create new columns or measures, or to query entirely new tables from a database and build the Power Query logic. When this happens, I save a .pbix of the current state of the online model/report as a backup, use a copy of that download to make any changes I need in the Desktop version, then republish the model/report with a new version number and get rid of the old one. I don't know if that was how it was intended, but that's the only way I could find to add things to my reports when I needed to change something in the dataset.
As for the feature, I could see using it when I need to make minor changes to the dataset. For anything major, I would likely still follow my old method. I'm also going to keep the option off until the times I need it, then just toggle it on. If I use the new feature, I would download the current-state .pbix as a backup, make the changes online, and if anything breaks I could republish the initial state; if nothing breaks, I'd download the new .pbix as a backup. I would really like to see support for Power Query in the online editor, since most of my datasets are originally created in Power Query, whether in Excel or Power BI (it's cool that they both support Power Query and that queries can be copied between them; I use Power Query extensively in many of my Excel-based reports). Power Query seems like an under-utilized but powerful tool for refreshable report building.
Thank you for taking the time to provide this insightful feedback! It is helpful for us to know that Power Query is something you use extensively and would like to see supported for data model editing in the web. It is also insightful to see that taking backups and copies of your dataset is something you regularly do today for your pbix files. Please stay tuned for additional updates and enhancements to data model editing in the web!
@emlisa, we have a very good use for this feature but it is limiting.
I am providing access to an open data model to users who are actively interested in creating their own reports in the service from a published dataset. However, I am concerned that when users make changes through the "Open data model" feature, such as creating measures or columns, the changes affect the original dataset instead of being limited to a local copy or file. I am wondering if this feature was specifically designed for BI developers. Can you please clarify how the open data model behaves? This information will help us give our users a more accurate response.
@neelambari26 there is no "local copy". There is only one dataset and one version. Multiple users write to the same dataset and can potentially overwrite each other's work. However, if they work on different measures, they can contribute to the same dataset.
Thank you for the feedback! This answer is accurate: there is one dataset and one version that multiple users can write to, rather than local copies. Please keep in mind that changes are permanent and automatically saved. We are currently investigating ways to let users revert to a previous point in time; feedback like this is helpful as we investigate future improvements!
Thank you for your feedback.
How do I get the changes back when I reopen the .pbix file in Power BI Desktop?
Changes made in the Service are not automatically kept in sync with changes made in Desktop. You will need to download the .pbix corresponding to your dataset in order to see modeling changes made in the Service from Desktop. Similarly, you will need to republish your file from Desktop to the Service to see modeling changes you made in Desktop in the Service.
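The Desktop-to-Service direction of that round trip can also be scripted. A minimal sketch, assuming the Power BI REST API's "Post Import In Group" endpoint and an access token acquired separately (e.g. with MSAL); the workspace ID, file path, and dataset name are placeholders, and the `nameConflict=CreateOrOverwrite` parameter makes the import replace the existing dataset of the same name.

```python
# Sketch: republishing a local .pbix to the Service via the
# "Post Import In Group" REST endpoint. Workspace ID and token
# are placeholders for your environment; error handling is omitted.
import uuid
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"

def import_url(group_id: str, dataset_name: str) -> str:
    """Build the import endpoint URL for a workspace and dataset name."""
    return (f"{API}/groups/{group_id}/imports"
            f"?datasetDisplayName={dataset_name}&nameConflict=CreateOrOverwrite")

def publish_pbix(group_id: str, path: str, dataset_name: str, token: str) -> int:
    """Upload the .pbix as a multipart/form-data body; return the HTTP status."""
    boundary = uuid.uuid4().hex
    with open(path, "rb") as f:
        body = (
            f"--{boundary}\r\n"
            f'Content-Disposition: form-data; name="file"; filename="{dataset_name}.pbix"\r\n'
            "Content-Type: application/octet-stream\r\n\r\n"
        ).encode() + f.read() + f"\r\n--{boundary}--\r\n".encode()
    req = urllib.request.Request(
        import_url(group_id, dataset_name),
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": f"multipart/form-data; boundary={boundary}",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Note that this is a one-way overwrite, exactly like publishing from Desktop: any modeling changes made in the Service since your last download will be lost.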
If possible, a sync warning would be great to avoid overwrites.
Thank you for the feedback! Improving warnings and guidance to prevent unwanted overwrites is something we are investigating for future improvements.
I found that online editing doesn't work when a dataset is in an embedded workspace, while it works for workspaces that aren't embedded. Is this known behaviour, and can we expect this to change in the future?