ThomasDay
Impactful Individual

How do you document your models?

Hello all,

 

I can see that my models will get complex fast, and the code is very decentralized, being embedded in queries, column adds, and measures (maybe more once you also look at visualizations).

 

How do you all document your code? Perhaps a data dictionary of computed columns, measures, and the like? Is there anything that does that automatically?

 

Thanks,

Tom

PS: I was just reading about corruption in models on this forum, and it has me wanting to be smart about this, presuming that's possible (on any level).

2 ACCEPTED SOLUTIONS
KHorseman
Community Champion

My documentation style is pretty low tech as these things go. Nothing automated but it's easy enough.

 

I write most of my columns and measures out in Notepad++ and then copy them into Power BI. The advantages here are 1) you can format your code nicely without trouble and 2) your documentation is half done already, especially if you comment your code as you go.

 

Beyond that it's mostly a matter of organizing the code and documenting the queries. For the queries, open the advanced editor and copy all that code. You can later recreate any query instantly by pasting that back into a blank query*.

 

For the documentation itself, my preference is to organize by tables: a brief description of the data represented, the query, a list of the columns and their data types, then the custom columns in alphabetical order, and the measures the same way.

 

I tend to create master datasets and use them each for several different reports, so I tend not to document the reports themselves very extensively. I just keep a changelog of requests that I get about them, because everyone around here likes to change their minds about what charts they want to see. One changelog per report, starting with a note about which dataset it's coming from.

 

I've had to recreate one lost report so far. The original took about a week, the recovery took about three hours, and to be fair I was watching a movie while I did it.

 

*Assuming in some cases that you've already established a connection and credentials with the data source from the machine you're working at.

 

 





Did I answer your question? Mark my post as a solution!

Proud to be a Super User!




View solution in original post

How to connect PBI to DAX Studio: http://exceleratorbi.com.au/getting-started-dax-studio/

incl. tons of other awesome stuff you can do with it!

Imke Feldmann (The BIccountant)

If you liked my solution, please give it a thumbs up. And if I did answer your question, please mark this post as a solution. Thanks!

How to integrate M-code into your solution -- How to get your questions answered quickly -- How to provide sample data -- Check out more PBI- learning resources here -- Performance Tipps for M-queries

View solution in original post

24 REPLIES
AnonymousPerson
Advocate V

These tools all stink. Just roll your own with the DMVs. It's a fun day's work.
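
For example, a minimal sketch of the DMV route from inside Power Query (the port and database name below are placeholders; grab the real localhost:port from DAX Studio's status bar, and if the Query option doesn't accept DMV text in your version, run the same SELECT directly in DAX Studio):

let
    // Placeholders: replace "localhost:12345" with the port shown in DAX Studio
    // and "ModelDatabaseName" with the model's database name.
    // TMSCHEMA_MEASURES holds measure names and their DAX expressions.
    Source = AnalysisServices.Database(
        "localhost:12345",
        "ModelDatabaseName",
        [Query = "SELECT [Name], [Expression] FROM $SYSTEM.TMSCHEMA_MEASURES"]
    )
in
    Source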

clausm73
Helper III
ThomasDay
Impactful Individual

Interesting...I'll take a look.  I'll be especially interested in learning about the Azure Data Catalog.  It's surfaced in a few recent presentations by the Microsoft crowd.  Thanks, Tom

ImkeF
Super User

Using DAX Studio to extract the measures: http://exceleratorbi.com.au/extract-calculated-fields-from-an-excel-workbook/

 

To extract all queries at once, start sending a frown but don't send the mail:

http://www.thebiccountant.com/2016/01/30/visualize-query-dependencies-in-power-query-and-power-bi/

This also describes a technique that helps you visualize dependencies between complex queries.

 

If you have many different measures and want to visualize the dependencies between them as well, you can use a Sankey diagram: http://www.thebiccountant.com/2015/12/18/visualize-dependencies-between-your-dax-measures/

Imke Feldmann (The BIccountant)

If you liked my solution, please give it a thumbs up. And if I did answer your question, please mark this post as a solution. Thanks!

How to integrate M-code into your solution -- How to get your questions answered quickly -- How to provide sample data -- Check out more PBI- learning resources here -- Performance Tipps for M-queries

@ImkeF Just what I was looking for. The Sankey chart looks like a perfect way to document both PQ queries and measures. Along with DAX Studio to document tables and measure formulas, this should save loads of time. Thanks a million. 🙂

@DavidMoss Thanks David.

For the other readers: Link to Sankey for PQ queries: http://www.thebiccountant.com/2016/01/30/visualize-query-dependencies-in-power-query-and-power-bi/

Imke Feldmann (The BIccountant)

If you liked my solution, please give it a thumbs up. And if I did answer your question, please mark this post as a solution. Thanks!

How to integrate M-code into your solution -- How to get your questions answered quickly -- How to provide sample data -- Check out more PBI- learning resources here -- Performance Tipps for M-queries

Hi @ImkeF, I was trying to use your Power Query Management Studio solution yesterday but encountered the following errors. Can you please advise where I might be going wrong or if there is a workaround?

 

I followed the instructions on that blog, and when I click RefreshAll in your Excel file I get an error, "The connection could not be refreshed...", and then another error stating "The following data range failed to refresh: ExterneDaten_1". Any suggestions?

edit 01-Sep-2016: Simplified code

 

Hi David,

thanks for reporting the problem, but I cannot replicate it and couldn't find the reason.

 

And as the detour via Excel is also not very elegant, and it is now very easy to copy the query code, here comes a new approach. Just one query for PBI:

 

let

// ToDo: Copy the queries from the query-pane, click the gear-wheel icon near the query name "Source" in the Applied steps, remove the existing content and paste into the table. This will then be automatically translated into the JSON-string.
    ClickGearWheelClearAndPaste = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45W0tdXCCxNLao0VIrViVbKSS0B0wpAEJxfWpScqmCrAJHKzEOTAXPBBMwMI9xmINlBhEHG1DLIhIBBxtpIDsdiXiwA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [Column1 = _t]),
    #"Changed Type" = Table.TransformColumnTypes(ClickGearWheelClearAndPaste,{{"Column1", type text}}),

// Identifying query names
    #"Trimmed Text" = Table.TransformColumns(#"Changed Type",{},Text.Trim),
    #"Cleaned Text" = Table.TransformColumns(#"Trimmed Text",{},Text.Clean),
    #"Filtered Rows" = Table.SelectRows(#"Cleaned Text", each ([Column1] <> "")),
    #"Added Index" = Table.AddIndexColumn(#"Filtered Rows", "Index", 0, 1),
    LetIn = Table.AddColumn(#"Added Index", "LetIn", each if [Column1]="let" then 1 else if [Column1]="in" then -1 else 0),
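    // Running total of the let/in markers gives each line's nesting depth within the pasted code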
    Level = Table.AddColumn(LetIn, "Level", (this)=>List.Sum(Table.SelectRows(LetIn, each [Index]<=this[Index])[LetIn])),
    QueryName = Table.AddColumn(Level, "QueryName", (this) => try if this[Index]=0 then this[Column1] else if Level[Column1]{this[Index]-2}= "in" and Level[Level]{this[Index]-2}=0 then this[Column1] else null otherwise null),
    #"Filled Down" = Table.FillDown(QueryName,{"QueryName"}),
    Replace = Table.Buffer(Table.ReplaceValue(#"Filled Down","// ","",Replacer.ReplaceText,{"QueryName"})),
    #"Added Custom" = Table.AddColumn(Replace, "Custom", each [Column1]),
    #"Split Column by Delimiter" = Table.SplitColumn(#"Added Custom","Custom",Splitter.SplitTextByEachDelimiter({"="}, QuoteStyle.Csv, false),{"Custom.1", "Custom.2"}),
    #"Removed Columns" = Table.RemoveColumns(#"Split Column by Delimiter",{"Custom.1"}),
    #"Filtered Rows1" = Table.SelectRows(#"Removed Columns", each ([Custom.2] <> null)),
    #"Added Custom1" = Table.AddColumn(#"Filtered Rows1", "Custom", each #table({"Column1"}, {{[Custom.2]}})),
    #"Added Custom2" = Table.AddColumn(#"Added Custom1", "Custom.1", each List.Select(Table.Transpose(Table.SplitColumn([Custom],"Column1",Splitter.SplitTextByAnyDelimiter({"+", "-", "/", "*", "(", ")", "}", "{", "[", "]", "#(lf)", "#(cr)", ","}, QuoteStyle.Csv)))[Column1], each _ <>"")),
    #"Removed Columns1" = Table.RemoveColumns(#"Added Custom2",{"Custom"}),
    #"Expanded Custom.1" = Table.ExpandListColumn(#"Removed Columns1", "Custom.1"),
    #"Replaced Value" = Table.ReplaceValue(#"Expanded Custom.1","#","",Replacer.ReplaceText,{"Custom.1"}),
    #"Trimmed Text1" = Table.TransformColumns(#"Replaced Value",{{"Custom.1", Text.Trim}}),
    Steps = Table.TransformColumns(#"Trimmed Text1",{{"Custom.1", Text.Clean}}),
    DistQueryNames = Table.Distinct(Replace, {"QueryName"}),
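    // Keep only the tokens that match an existing query name (self-references are filtered out below)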
    TblQueryReferences = Table.NestedJoin(Steps,{"Custom.1"},DistQueryNames,{"QueryName"},"NewColumn",JoinKind.Inner),
    #"Added Custom3" = Table.AddColumn(TblQueryReferences, "Filter", each if [QueryName]=[Custom.1] then "Out" else "In"),
    #"Filtered Rows2" = Table.SelectRows(#"Added Custom3", each ([Filter] = "In")),
    QueryReferences = Table.RemoveColumns(#"Filtered Rows2",{"NewColumn"}),

// Prepare ParentChildTable
    StandaloneQueries1 = List.Difference(List.Distinct(Replace[QueryName]), List.Distinct(QueryReferences[QueryName])),
    StandaloneQueries = Table.FromList(StandaloneQueries1, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    RenameStandalone = Table.RenameColumns(StandaloneQueries,{{"Column1", "Child"}}),
    Rename = Table.RenameColumns(QueryReferences,{{"Custom.1", "Parent"}, {"QueryName", "Child"}}),
    #"Removed Other Columns" = Table.SelectColumns(Rename,{"Child", "Parent"}),
    #"Removed Duplicates" = Table.Distinct(#"Removed Other Columns"),
    ChildParentTable = Table.Buffer(Table.Combine({#"Removed Duplicates", RenameStandalone}))
in
    ChildParentTable

 

 

It comes with some sample data included in the step "ClickGearWheelClearAndPaste", so it will already return a table that produces a Sankey diagram.

In order to fill it with the code of your queries, you just check your query names in the editor and copy. Then edit the "ClickGearWheelClearAndPaste" step by first deleting the existing content and then pasting. Save & load, and the new Sankey should appear. Have a look at the video: https://youtu.be/8QEuuO8mbKU

 

But be aware: don't copy this code using one of Microsoft's browsers! It will break and return crazy error messages (as with almost all M code in this forum software, unfortunately). I haven't had problems with Firefox.

Imke Feldmann (The BIccountant)

If you liked my solution, please give it a thumbs up. And if I did answer your question, please mark this post as a solution. Thanks!

How to integrate M-code into your solution -- How to get your questions answered quickly -- How to provide sample data -- Check out more PBI- learning resources here -- Performance Tipps for M-queries

Wow... will try that @ImkeF. But with 20 separate queries, rather than opening each in the advanced editor to copy and paste, I assume I can do the "send a frown" and copy the M code section from that email? Hopefully... will try and let you know how I get on. Thanks for your continued support.

Hi David,

no, unfortunately "send a frown" will not work with this one. The syntax of the query names is different.

But it's much easier than you think: check your first query -> press Shift -> check your last query -> right-click -> copy: done.

Much quicker than send frown.

Or if you have organized them in folders: just check the folder names and the copy will transfer all of them. No need to worry about dependent queries - they will come along automatically.

This is, by the way, also a very convenient method to copy queries across workbooks, and also between PBI and Excel: instead of pasting them into a table, just paste them into the editor.

 

No need to open the advanced editor at all (apart from pasting the query above that does the job) 🙂

 

Imke Feldmann (The BIccountant)

If you liked my solution, please give it a thumbs up. And if I did answer your question, please mark this post as a solution. Thanks!

How to integrate M-code into your solution -- How to get your questions answered quickly -- How to provide sample data -- Check out more PBI- learning resources here -- Performance Tipps for M-queries

@ImkeF I finally got around to testing your solution out. That's some amazing code... thanks.

You just have to pay attention to the cutting and pasting into the ClickGearWheelClearAndPaste applied step. (I see your code also handles demoting the headers, so we shouldn't do that step ourselves when copying and pasting puts the copied text into the header.)

 

I see in your video that you cut and pasted the new queries in what appeared to be Excel Power Query, not the Power BI Edit Queries UI.

I have just tried it from the Power BI UI and it is working. WOW

 

Only thing is, most of my model is data connections only, which are then appended into one big master fact table, so it is just showing the last append and the fact tables for some reason.

 

Have you used your solution on data-connection-only queries?

 

But an amazing solution... I'd give you 10 votes if I could.

Hi David, thanks!

 

What exactly do you mean by "data connections only"?

What does the step "QueryNames" return? Do you see the names of your "data connections only" queries in the column "QueryName"?

 

Re your warning on the demoted header: good catch, but check out the next step, "Filtered Rows 3": there I cater for both cases, first line of code in the header or not. So you can paste in two different ways without running into a problem here.

 

 

Imke Feldmann (The BIccountant)

If you liked my solution, please give it a thumbs up. And if I did answer your question, please mark this post as a solution. Thanks!

How to integrate M-code into your solution -- How to get your questions answered quickly -- How to provide sample data -- Check out more PBI- learning resources here -- Performance Tipps for M-queries

@ImkeF Apologies that this dialogue continues; I just think I am close to resolving this and want to persist, to benefit from your hard work and hopefully to your satisfaction.

 

By 'data connections only' what I mean is that the query does not load the data to the data model but makes a connection only.

 

I often use this technique to prepare connections to data without actually loading it, for example when connecting to multiple sources of data with a similar schema. Then, with all the 'data connections only' queries prepared and tested, I 'Append' all the connections together to make one large fact table, which is loaded into the data model.

Thus I am thinking your code only sees the final fact table and not all the 'data connections only' queries.
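
To make the pattern concrete, here is a stripped-down sketch of what I mean (file paths and query names are made up); the two regional queries are set to 'connection only' and only the appended query is loaded to the model:

// Query "SalesRegionA" (connection only) - hypothetical source
let
    SourceA = Csv.Document(File.Contents("C:\Data\SalesRegionA.csv"), [Delimiter = ",", Encoding = 65001]),
    PromotedA = Table.PromoteHeaders(SourceA, [PromoteAllScalars = true])
in
    PromotedA

// Query "SalesRegionB" (connection only) - same schema, different source
let
    SourceB = Csv.Document(File.Contents("C:\Data\SalesRegionB.csv"), [Delimiter = ",", Encoding = 65001]),
    PromotedB = Table.PromoteHeaders(SourceB, [PromoteAllScalars = true])
in
    PromotedB

// Query "FactSales" (the only query loaded to the data model)
let
    FactSales = Table.Combine({SalesRegionA, SalesRegionB})
in
    FactSales

So in the dependency table I would expect SalesRegionA and SalesRegionB to show up as parents of FactSales.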

 

What I hoped for was a Sankey that showed the flow of queries from ALL the 'data connections only' queries into the fact table.

 

Looking at the code, it is the 'parent' that is creating the issue, as it returns a single parent (the last append table) for the fact table.

 

Anyway, this is getting too complicated... if you have the patience, then maybe we should take this offline?
David

Hi!

 

Old but still interesting topic.

From my recent research with Google, I could see these three ways to document a PBI project:

- DataVizioner https://app.datavizioner.com/file

- DAX Studio https://exceleratorbi.com.au/getting-started-dax-studio/

- Azure Data Catalog https://docs.microsoft.com/en-us/azure/data-catalog/data-catalog-dsr

 

All of the above are to some extent interesting.

What I am still missing in the end is a real, simple way to comment the tables, M, and DAX code, and then have software that extracts those comments in a nice manner. For example, in DAX I would comment with

//.doc This measure is calculating the turnover by day

// 1234

The software should be able to keep the real documentation comment ("//.doc") and not consider the other one.

The idea would be that you write your comments/properties directly in the PBIX file, and then the software is able to export them directly to Excel, Word, or PDF to document the model nicely.
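
As a rough illustration of just the extraction part (not an existing tool, only a sketch in M with made-up sample data): given a table with one row per measure and its full DAX expression, keeping only the ".doc" comments could look like this:

let
    // Made-up sample input: one row per measure with its full DAX expression
    Measures = #table(
        {"MeasureName", "Expression"},
        {{"Turnover by Day", "//.doc This measure is calculating the turnover by day#(lf)// 1234#(lf)SUM ( Sales[Amount] )"}}
    ),
    // Split each expression into lines and keep only those starting with "//.doc"
    AddDoc = Table.AddColumn(Measures, "Documentation", each
        Text.Combine(
            List.Select(
                Text.Split([Expression], "#(lf)"),
                each Text.StartsWith(Text.Trim(_), "//.doc")
            ),
            " "
        )
    ),
    Result = Table.SelectColumns(AddDoc, {"MeasureName", "Documentation"})
in
    Result

Getting the expressions out of the PBIX in the first place (DMVs, DAX Studio, or copy & paste) would be the other half of the job.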

 

What do you think about it? Does it make sense?

Do you have a better solution?

 

BR, G.

@ImkeF The "extract the queries by sending a frown" trick = Awesome!


Looking for more Power BI tips, tricks & tools? Check out PowerBI.tips the site I co-own with Mike Carlo. Also, if you are near SE WI? Join our PUG Milwaukee Brew City PUG
kcantor
Community Champion

For now, I keep multiple copies of the pbix file in different locations. When I make changes I will only save to one location, to make sure that the file is good. After a couple of days or weeks, I update the other versions of the file. I have learned to always have a backup. If one craps out, I simply fall back to a backup and rewrite the last couple of calculations.

EDIT: Posted this on the wrong thread.

My models are documented using DAX Studio to create an expression dictionary that is saved separately from the file.





Did I answer your question? Mark my post as a solution!

Proud to be a Super User!




ThomasDay
Impactful Individual

These are an absolute treasure trove of good stuff.  Two quick questions:

  • Is DAX Studio for Excel only? It seemed to be looking for SQL Server on install. If it can be used for Power BI, is it a stable product, enough so that you do your coding there and then push it, or some such thing? What a great project. Do they keep up with Power BI's dev pace?
  • And send the frown...ha!  Very cool!!

Tom

PS: the Notepad++ idea is a very ingenious solution. I use that already and should have figured it out. Well done everyone, and many, many thanks.

 

How to connect PBI to DAX Studio: http://exceleratorbi.com.au/getting-started-dax-studio/

incl. tons of other awesome stuff you can do with it!

Imke Feldmann (The BIccountant)

If you liked my solution, please give it a thumbs up. And if I did answer your question, please mark this post as a solution. Thanks!

How to integrate M-code into your solution -- How to get your questions answered quickly -- How to provide sample data -- Check out more PBI- learning resources here -- Performance Tipps for M-queries

Awesome tool, thanks for the recommendation. Just installed it with ease. I'm trying to output my measures to Excel but I don't have that option, only Grid, Timer and File. Any idea how to activate output to Excel?

 

Also, is there any way to sort fields in the data view of PBI? I have dozens of measures, and the table seems to sort them in the order they were created, not in alphabetical order. The measures are sorted in alphabetical order in the report view but not the data view, which makes it very hard to find and edit measures when I have so many. Any suggestions?
