Hit Reply to tell us about your experience with Power BI Project files (PBIP) and Fabric Git Integration for Power BI Semantic Models and Reports so we can continue to improve.
For example:
- What changes would you like to see?
- What are your main challenges/issues?
- Any suggestions for additional settings or capabilities?
Thanks,
-Power BI team
I noticed that after I save a PBIX file in PBIP format, the pop-up accessed from the top of the Power BI Desktop window frame still shows ".pbix" next to the file name.
As this is the only place I am aware of to get this info ("which file do I have open?"), it seems important to get it right.
We fixed this in the March release, with the new developer mode flyout. Any thoughts or feedback on that?
Yes that looks great - thanks.
But now you've raised the bar, and the old flyout (for PBIX files) is missing a link to the Location 😄
I have a fairly large PBIX (150MB) which is also fairly complex (110+ tables, 40+ pages). On my laptop it opens in 3 minutes from PBIX format, but 4 minutes from PBIP+TMDL format.
Not a showstopper by any means, but perhaps a step in the wrong direction. I imagined it would be a bit faster in PBIP+TMDL format, as no need to unzip a large file at the start, parallel I/O etc.
We are aware of some performance issues on open/save of PBIP vs. PBIX and will improve them in upcoming releases so that PBIP is as fast as or faster than PBIX. The reason PBIP is currently still as slow as PBIX is the save of the cache.abf file.
Thanks for the feedback.
Very happy to see the product moving in this direction. I've been using external tools to do similar things for a while, but I always prefer supported formats. This should also encourage further external tool development.
I have bumped up against the Windows path limit (260 characters by default) when saving a PBIX in PBIP format. One trigger was the lengthy subfolder name for a custom visual, which includes a GUID. But in any case this seems a risk for any editing work in this format: e.g., add a new table with a longer table name, blow the limit, and the save will fail.
Can you add some logic to Power BI Desktop to detect and avoid these issues?
Thanks for the feedback. In your opinion, what would be the best experience? Blocking you from creating the table because it would generate a long path? Today we block on save, which lets you save to another location if you wish.
I think your current functionality is the most practical.
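For anyone who wants to catch this before committing, a pre-save check is easy to script. A minimal sketch (assuming the default 260-character MAX_PATH; the helper name and headroom value are illustrative):

```python
import os

# Windows MAX_PATH is 260 characters by default (it can be raised via
# the LongPathsEnabled registry setting on Windows 10+).
MAX_PATH = 260

def paths_over_limit(pbip_root, headroom=20):
    """Return files under a PBIP folder whose absolute path is within
    `headroom` characters of the Windows path limit (or already over it),
    longest first."""
    at_risk = []
    for dirpath, _, filenames in os.walk(pbip_root):
        for name in filenames:
            full = os.path.abspath(os.path.join(dirpath, name))
            if len(full) >= MAX_PATH - headroom:
                at_risk.append((len(full), full))
    return sorted(at_risk, reverse=True)
```

Running this against the PBIP folder before a commit flags paths that a rename or a new custom visual subfolder could push over the limit.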
Continuous deployment from Azure Devops (with PBIP) to dev workspaces is the most important planned feature for me.
I see there is some progress there with the new API and pipeline scripts, but I am unwilling to implement preview features across the enterprise.
Also, the notation and options for column definitions: the available docs don't cover them:
https://learn.microsoft.com/en-us/power-bi/developer/projects/projects-overview
https://learn.microsoft.com/en-us/power-bi/developer/projects/projects-dataset
Thanks for the feedback.
About PBIP deployment, you may use the new Fabric APIs:
https://learn.microsoft.com/en-us/rest/api/fabric/articles/item-management/item-management-overview
Example using PowerShell: https://github.com/microsoft/Analysis-Services/tree/master/pbidevmode/fabricps-pbip
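For reference, the Create Item call in those APIs takes the item's files as a `parts` list of base64-encoded payloads. A minimal sketch of packaging a PBIP item folder that way (error handling and token acquisition omitted; `create_item` is an illustrative wrapper, not part of any SDK):

```python
import base64
import os

def build_item_definition(item_folder):
    """Package every file in a PBIP item folder (e.g. 'Sales.Report')
    into the `definition.parts` shape the Fabric Items API expects."""
    parts = []
    for dirpath, _, filenames in os.walk(item_folder):
        for name in filenames:
            full = os.path.join(dirpath, name)
            rel = os.path.relpath(full, item_folder).replace(os.sep, "/")
            with open(full, "rb") as f:
                payload = base64.b64encode(f.read()).decode("ascii")
            parts.append({
                "path": rel,
                "payload": payload,
                "payloadType": "InlineBase64",
            })
    return {"parts": parts}

def create_item(workspace_id, display_name, item_type, item_folder, token):
    """POST the item to a workspace (requires the `requests` package
    and a valid Fabric access token)."""
    import requests  # assumption: requests is installed
    url = f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/items"
    body = {
        "displayName": display_name,
        "type": item_type,  # e.g. "Report" or "SemanticModel"
        "definition": build_item_definition(item_folder),
    }
    return requests.post(url, json=body,
                         headers={"Authorization": f"Bearer {token}"})
```

The PowerShell scripts linked above do essentially the same packaging; this is just the shape of the payload, for anyone wiring it into a different pipeline.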
About the local date tables, they are not specific to TMDL. They show up in your model due to the AutoDatetime feature enabled for your model: https://learn.microsoft.com/en-us/power-bi/transform-model/desktop-auto-date-time
LocalDateTable entries are an indication that you forgot to disable Auto date/time. They should never be part of an enterprise semantic model.
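A quick lint over a TMDL folder can catch leftover auto date/time tables before review. A sketch, assuming the `LocalDateTable_`/`DateTableTemplate_` name prefixes that Desktop generates (not an official check):

```python
import os
import re

# Desktop names auto date/time tables with these prefixes.
AUTO_DATE_PREFIXES = ("LocalDateTable_", "DateTableTemplate_")

def find_auto_date_tables(tmdl_folder):
    """Return the names of auto date/time tables declared in any
    .tmdl file under `tmdl_folder`."""
    hits = []
    pattern = re.compile(r"^table\s+'?([^'\r\n]+)'?", re.MULTILINE)
    for dirpath, _, filenames in os.walk(tmdl_folder):
        for name in filenames:
            if not name.endswith(".tmdl"):
                continue
            with open(os.path.join(dirpath, name), encoding="utf-8") as f:
                for match in pattern.finditer(f.read()):
                    table = match.group(1)
                    if table.startswith(AUTO_DATE_PREFIXES):
                        hits.append(table)
    return hits
```

Wired into a CI step, a non-empty result fails the build, which is a cheap way to enforce the "no auto date/time in enterprise models" rule.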
Hi
We have a lot of reports connected live to on-prem ssas (Fabric is on the road map)
I would like to add the report part of these files into source control, and I think I succeeded with it yesterday.
I save the thin report as pbip > commit to devops repo > connect from workspace to repo.
But trying the setup from scratch again today, when I click Update all in the workspace I just get: Failed to discover dependencies ... Azure Analysis Services and SQL Server Analysis Services hosted semantic models are not supported....
Is it possible?
We are working to support the AS/AAS scenario. Until then, if all you want is to source control the thin reports: if you live connect the report to the model in the service rather than to Analysis Services, it should work, because the connection in definition.pbir will be to the model in the workspace and not to Analysis Services. I believe the error you get is because the connection in definition.pbir is to the Analysis Services server. Can you confirm?
Great that you are working on it.
Yes, I can confirm. There is no model in the service, I can connect to, as the reports are connected through a gateway to on-prem SSAS. So I have saved the AS connected PBIX as a PBIP, and get the error when I try to sync from Azure DevOps repo into an empty workspace. The definition.pbir connectionString is pointing at the SSAS server.
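For anyone hitting the same error: the binding kind is visible in definition.pbir itself, either a `byPath` reference to a model folder in the project or a `byConnection` block with a connection string. A small illustrative check (field names per the PBIP docs; treat this as a sketch):

```python
import json

def dataset_reference_kind(pbir_path):
    """Classify how a report's definition.pbir binds to its model:
    'byPath' (model folder in the same project) or 'byConnection'
    (explicit connection string, e.g. to an AS/AAS server)."""
    with open(pbir_path, encoding="utf-8") as f:
        ref = json.load(f).get("datasetReference", {})
    if ref.get("byPath"):
        return "byPath", ref["byPath"].get("path")
    if ref.get("byConnection"):
        return "byConnection", ref["byConnection"].get("connectionString")
    return "unknown", None
```

A `byConnection` result pointing at an AS/AAS server is, per the reply above, what currently triggers the "Failed to discover dependencies" error during Git sync.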
Please allow an option for more granular serialization to represent measures as distinct TMDL files rather than part of a larger table definition. This makes maintaining measures simpler.
I currently do this with TE3 folder structure in JSON format and can't switch to TMDL until this level of granularity is possible.
It's in our backlog, but not a priority. Can you please log a new idea at aka.ms/fabricideas?
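Until granular serialization ships, the split can be scripted externally. A rough sketch that peels `measure` blocks out of a table's TMDL text (assumes tab indentation as Desktop emits it; not a full TMDL parser, and measure names containing `=` would need more care):

```python
def split_measures(table_tmdl):
    """Split a table's TMDL text into (remainder, {measure_name: block}).
    A measure block starts at a line like "\tmeasure Name = ..." and
    runs while following lines are blank or indented more deeply."""
    lines = table_tmdl.splitlines()
    remainder, measures = [], {}
    i = 0
    while i < len(lines):
        line = lines[i]
        stripped = line.lstrip("\t")
        if stripped.startswith("measure "):
            # Name is everything between "measure " and the first "=".
            name = stripped[len("measure "):].split("=")[0].strip().strip("'")
            indent = len(line) - len(stripped)
            block = [line]
            i += 1
            while i < len(lines) and (
                not lines[i].strip()
                or len(lines[i]) - len(lines[i].lstrip("\t")) > indent
            ):
                block.append(lines[i])
                i += 1
            measures[name] = "\n".join(block)
        else:
            remainder.append(line)
            i += 1
    return "\n".join(remainder), measures
```

Each extracted block could then be written to its own .tmdl file, roughly mirroring the per-measure folder structure Tabular Editor 3 uses for JSON.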
Hi @RuiRomanoMS ,
What changes would you like to see?
I hope the UI becomes more concise and clear, and that Git integration synchronization becomes more real-time and convenient.
What are your main challenges/issues?
The Git integration synchronization steps are a bit cumbersome, and sometimes there are restrictions.
Any suggestions for additional settings or capabilities?
I would like broader compatibility and support for synchronizing certain currently unsupported workspace items (especially Fabric-related items).
Best Regards,
Liu Yang
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.