
RuiRomanoMS
Microsoft Employee

Share your thoughts on Power BI Project files (PBIP) and Fabric Git Integration

Hit Reply to tell us about your experience with Power BI Project files (PBIP) and Fabric Git Integration for Power BI Semantic Models and Reports so we can continue to improve.

For example:

  • What changes would you like to see?
  • What are your main challenges/issues?
  • Any suggestions for additional settings or capabilities?

Thanks,

-Power BI team

25 REPLIES
ThePowerOfBI
New Member

Hello,

I would like to be able to open PBIP files stored on my OneDrive, since that makes them easier to share and more secure, with versioning handled by SharePoint. At the moment, the PBIP file does not show up there for opening...

I would also like to test PBIP files I downloaded from the internet as scheme/design templates.
And I would like better documentation to get a clearer idea of how I can use PBIP as a theming environment.
Thanks in advance 😉

Cheers,

Michael

ChristopheB
Frequent Visitor

I downloaded a PBIX report from a workspace (live connection to the dataset).
When I save this report as PBIP and then re-open it, I always get an error that the model cannot be loaded. Even when I re-connect this report to the online dataset, it still gives the same error after re-opening.

 

Is it not possible to save a report as PBIP when it has a live connection to a dataset?

ChristopheB_0-1718875022579.png

 

jol
Frequent Visitor

Hi Rui and team,

 

I've discovered a potential issue with reports saved using PBIP (it doesn't appear to be specific to the new PBIR format; it also applies to PBIR-Legacy).

 

I have a thin (live-connected) report that contains some report-level measures. When I save it as PBIP (PBIR or PBIR-Legacy), the "definition.pbir" file contains a "byConnection" dataset reference and not a "byPath" dataset reference. This is expected.

 

I then manually change the "definition.pbir" file to use a "byPath" dataset reference, pointing to the local copy (saved as PBIP) of the same semantic model that it is currently live-connected to in the service. Note that I am not currently using Fabric Git Integration. But I understand that this happens automatically when you sync a thin report to git, if it belongs to the same workspace as the semantic model it is live-connected to.
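The manual edit described above can also be scripted. Below is a minimal sketch, assuming the JSON shape this post describes (a top-level "datasetReference" holding either "byConnection" or "byPath"); the function name and parameters are hypothetical, not part of any Microsoft tooling:

```python
import json
import pathlib

def switch_to_local_model(pbir_path, relative_model_path):
    """Repoint a thin report's definition.pbir at a local semantic model folder.

    ASSUMPTION: definition.pbir has a top-level "datasetReference" object
    holding either "byConnection" or "byPath", as described in the post.
    """
    path = pathlib.Path(pbir_path)
    definition = json.loads(path.read_text(encoding="utf-8"))
    # Replace the live connection with a relative path to the local model.
    definition["datasetReference"] = {
        "byPath": {"path": relative_model_path}
    }
    path.write_text(json.dumps(definition, indent=2), encoding="utf-8")
    return definition
```

Reversing the edit (restoring the saved "byConnection" object) would switch the report back to live-connected mode.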

 

I now open the report in Power BI Desktop. As expected, it now opens as a report with a local editable semantic model, rather than as a live-connected report. This is potentially a really convenient new way to work on thin reports during stages of development where the semantic model is not yet fully stable.

 

In the Table, Model, and DAX Query views, the report-level measures don't show up. This makes sense, as these measures are not part of the semantic model.

 

In the Report view, the report-level measures do show up. However, left-clicking one of these measures does not bring up the Measure tools ribbon tab or the DAX editor, as it does for model measures. Even worse, right-clicking one of these measures brings up an error message:
jol_0-1718803009396.png

 

It would be great to have the ability to edit report-level measures within reports that are opened with a local model. It would be even better to be able to create new report-level measures, and even move measures back and forth between the report and the model.

 

However, as a first step, it would be good to have it fail gracefully with a more informative error message!

 

Keen to hear what your thoughts are about this particular scenario. It appears to be an unforeseen consequence of allowing thin/live-connected reports to become paired with a pre-existing local model (this is clearly not a problem when you create a brand new local model with a DirectQuery connection to the original model, as any report-level measures can just become model measures in the new local model).

 

It might also be useful to know what Microsoft's plans are for report-level measures in general moving forward.

jol
Frequent Visitor

Already talked to Rui about the new Power BI enhanced report format (PBIR) on the recent blog post, but thought I'd post here as well.

I've created two Ideas that you should all vote for! They both propose that user code within a report definition should be stored outside the standard PBIR JSON files, so it can be edited directly offline without having to worry about escape characters. I understand the first one (report-level measures in TMDL files) is already on the backlog, but it will likely be released post-GA.

Serialize report-level measures into one or more TMDL files when saving a report using the Power BI ... 

Serialize visual custom code (R/Python script visual, Deneb, etc.) as standalone files when saving a... 
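To illustrate the escaping problem these two Ideas address, here is a small sketch of what embedding script code inside JSON does to it (the {"visual": {"script": ...}} shape is purely illustrative, not the actual PBIR schema):

```python
import json

# A tiny script such as a Python/R script visual or Deneb spec might contain.
script = 'import pandas as pd\nprint(df.head())\n'

# Embedded in a JSON report definition, every newline (and any quote)
# must be escaped, which makes diffs noisy and offline editing error-prone.
embedded = json.dumps({"visual": {"script": script}})
assert "\\n" in embedded  # the escaped form, not a real newline

# A standalone file, as the Ideas propose, would store the script verbatim.
recovered = json.loads(embedded)["visual"]["script"]
assert recovered == script
```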

 

AkhilAshok
Solution Sage

I’ve encountered an issue in PBIR when attempting to group bookmarks. It appears that not all metadata is being captured correctly; only ‘children’ details are retained while other relevant information is omitted.

Thanks for reporting. It's a bug, and a fix will roll out in the next few days.

RogeroWijshombr
Regular Visitor

 

I would like Git integration included in Power BI Desktop for easier collaboration.

On top of that, exclusive lock rights would be super awesome, so multiple people cannot work on the same file at once.
It would also be nice to be able to turn off creation of the cache.abf file in PBIP, because it consumes a lot of storage on a shared Teams/OneDrive location (due to the automatic versioning).
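For Git-based workflows specifically, the cache can already be kept out of version control via a .gitignore in the project folder; entries along these lines should cover the local cache and settings (the exact patterns here are an assumption — check the .gitignore Power BI Desktop generates alongside your PBIP):

```
**/.pbi/localSettings.json
**/.pbi/cache.abf
```

This doesn't help with OneDrive/Teams versioning, though, which only a setting to suppress cache.abf creation would address.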

mike_honey
Memorable Member

I noticed that after I save a PBIX file in PBIP format, the pop-up accessed from the top of the Power BI Desktop window frame still shows ".pbix" next to the file name.

As this is the only place I am aware of to get this info ("which file do I have open?"), it seems important to get it right.

 

mike_honey_0-1711411788344.png

 

We fixed this in the March release, with the new developer mode flyout. Any thoughts/feedback on that?

Yes that looks great - thanks. 
But now you've raised the bar, and the old flyout (for PBIX files) is missing a link to the Location 😄

mike_honey
Memorable Member

I have a fairly large PBIX (150MB) which is also fairly complex (110+ tables, 40+ pages). On my laptop it opens in 3 minutes from PBIX format, but 4 minutes from PBIP+TMDL format.

Not a showstopper by any means, but perhaps a step in the wrong direction. I imagined it would be a bit faster in PBIP+TMDL format, as no need to unzip a large file at the start, parallel I/O etc.

We are aware of some performance issues on open/save of PBIP vs PBIX and will improve them in upcoming releases to be as fast as or faster than PBIX. The reason PBIP is still as slow as PBIX is the saving of the cache.abf file.

 

Thanks for the feedback.

mike_honey
Memorable Member

Very happy to see the product moving in this direction. I've been using external tools to do similar things for a while, but I always prefer supported formats; this should also encourage further external tool development.
I have bumped up against the Windows path limit (260 characters by default) when saving a PBIX in PBIP format. One trigger was the lengthy subfolder name for a custom visual, which includes a GUID. In any case, this seems a risk for any editing work using this format: add a new table with a long enough name to blow the limit, and saving will crash.
Can you add some logic to Power BI Desktop to detect and avoid these issues?

Thanks for the feedback. In your opinion, what would be the best experience? Blocking you from creating the table because it will generate a long path? Today we block on save, which lets you save to another location if you wish.

I think your current functionality is the most practical.
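In the meantime, a small script can audit an existing PBIP folder for files nearing the limit before a rename or save trips it. This is a sketch under the assumption of the classic 260-character Windows MAX_PATH (which counts the terminating NUL); `long_paths` is a hypothetical helper, not an existing tool:

```python
import os

MAX_PATH = 260  # classic Windows limit, counting the terminating NUL

def long_paths(root, limit=MAX_PATH - 1):
    """Yield (path, length) for files under root whose full path exceeds limit."""
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            full = os.path.join(os.path.abspath(dirpath), name)
            if len(full) > limit:
                yield full, len(full)
```

Run over a PBIP folder before committing, this would flag, for example, a custom visual subfolder whose GUID-laden name pushes a file past the limit.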

DevopsPls
Frequent Visitor

Continuous deployment from Azure Devops (with PBIP) to dev workspaces is the most important planned feature for me.

 

I see there is some progress there with the new API and pipeline scripts, but I am unwilling to implement preview features across the enterprise.

jmarciogsousa
Frequent Visitor

Hi,

Congrats on the solution.

  • It would be great to have an interface or VS Code extension that resolves project relations and dependencies automatically, like the VS Code MS SQL project. For my use case, I would like to create a dataset and publish it programmatically, using a PBIP template/base project, without using the Power BI app or a workspace.
  • It would also be great to improve the documentation explaining the logic of the TMDL language. For example, tables like "LocalDateTable_caedc611..." are created in the definition, and there is no documentation to help understand them.

jmarciogsousa_0-1708598818761.png

Or the definition of column notation and options:

jmarciogsousa_1-1708598890940.png

The available docs don't cover this:

https://learn.microsoft.com/en-us/power-bi/developer/projects/projects-overview
https://learn.microsoft.com/en-us/power-bi/developer/projects/projects-dataset


Thanks for the feedback.

About PBIP deployment, you may use the new Fabric APIs:

https://learn.microsoft.com/en-us/rest/api/fabric/articles/item-management/item-management-overview

Example using PowerShell: https://github.com/microsoft/Analysis-Services/tree/master/pbidevmode/fabricps-pbip
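Building on the PowerShell example linked above, a rough Python sketch of assembling the definition parts payload the Fabric item APIs expect might look like this (the part shape follows the public Fabric REST API docs; `definition_parts` is a hypothetical helper, and skipping cache.abf, a local-only cache, is this sketch's assumption):

```python
import base64
import pathlib

def definition_parts(item_folder):
    """Build the "definition.parts" list for the Fabric item create/update APIs.

    Each part carries a relative path and a base64-encoded payload, matching
    the {"path", "payload", "payloadType": "InlineBase64"} shape from the
    public Fabric REST API documentation.
    """
    root = pathlib.Path(item_folder)
    parts = []
    for file in sorted(root.rglob("*")):
        if file.is_file() and file.name != "cache.abf":  # skip the local cache
            parts.append({
                "path": file.relative_to(root).as_posix(),
                "payload": base64.b64encode(file.read_bytes()).decode("ascii"),
                "payloadType": "InlineBase64",
            })
    return parts
```

The resulting list can then be sent as {"definition": {"parts": parts}} to the workspace items endpoint, which is essentially what the linked fabricps-pbip PowerShell module automates.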

 

About the local date tables: they are not specific to TMDL. They show up in your model because the Auto date/time feature is enabled for it: https://learn.microsoft.com/en-us/power-bi/transform-model/desktop-auto-date-time

 

 

LocalDateTable entries are an indication that you forgot to disable Auto date/time. They should never be part of an enterprise semantic model.

UNIheve
Frequent Visitor

Hi,

We have a lot of reports connected live to on-prem SSAS (Fabric is on the roadmap).

I would like to add the report part of these files to source control, and I think I succeeded with it yesterday:

save the thin report as PBIP > commit to the DevOps repo > connect the workspace to the repo.

But trying the setup from scratch again today, when I click "Update all" in the workspace I just get: Failed to discover dependencies ... Azure Analysis Services and SQL Server Analysis Services hosted semantic models are not supported....

Is this possible?

@RuiRomanoMS 
