08-24-2025 16:50 PM - last edited 08-26-2025 10:50 AM
Business Need: Add commentary to live reports easily
As a financial modeling and BI analyst, I serve business users who want to show how market and other external factors affect key financial and operational metrics: for example, noting the impact of tariffs on 2025 cost of sales, that a lawsuit caused a sudden stock price crash, or that a sudden change of vendor caused a spike in inventory holding days.
Before Translytical: Exporting Power BI reports to PDF or PowerPoint to add those comments
Most business users end up doing this outside of Power BI. On the few occasions we tried to give them a way to write these comments into a database, the process proved too slow for the users, and they stopped using the database-based solution. They preferred to export the Power BI report to PDF or PowerPoint and add their annotations and commentary there.
With Translytical: Users can now add comments and instantly see them displayed in their reports
Translytical task flows address the core need: comments are captured instantly and can be updated without limit. The business users of my reports are satisfied with the translytical-based solutions.
This Showcase: Add live commentary to financial reports on Power BI
We will build a solution that lets business users add comments about the different financial metrics they track for their investment monitoring and decision-making needs. The code and reports are deliberately simple and easy to replicate, so you can see how little technical work is needed to add this capability.
Needed Resources:
You can use Dataflow Gen2 to load the first two tables and generate the third. I uploaded the Excel file to Lakehouse storage and pointed Dataflow Gen2 at the file there. The M queries are below:
Metrics Order Table
let
Source = Lakehouse.Contents(null),
Navigation = Source{[workspaceId = "<your-workspace-id>"]}[Data],
#"Navigation 1" = Navigation{[lakehouseId = "<your-lakehouse-id>"]}[Data],
#"Navigation 2" = #"Navigation 1"{[Id = "Files", ItemKind = "Folder"]}[Data],
#"Navigation 3" = #"Navigation 2"{[Name = "translytical - source tables.xlsx"]}[Content],
#"Imported Excel workbook" = Excel.Workbook(#"Navigation 3", null, true),
#"Navigation 4" = #"Imported Excel workbook"{[Item = "Metrics Order", Kind = "Sheet"]}[Data],
#"Promoted headers" = Table.PromoteHeaders(#"Navigation 4", [PromoteAllScalars = true]),
#"Changed column type" = Table.TransformColumnTypes(#"Promoted headers", {{"Metric", type text}, {"Position", Int64.Type}})
in
#"Changed column type"
Company Metrics Table
let
Source = Lakehouse.Contents(null),
Navigation = Source{[workspaceId = "<your-workspace-id>"]}[Data],
#"Navigation 1" = Navigation{[lakehouseId = "<your-lakehouse-id>"]}[Data],
#"Navigation 2" = #"Navigation 1"{[Id = "Files", ItemKind = "Folder"]}[Data],
#"Navigation 3" = #"Navigation 2"{[Name = "translytical - source tables.xlsx"]}[Content],
#"Imported Excel workbook" = Excel.Workbook(#"Navigation 3", null, true),
#"Navigation 4" = #"Imported Excel workbook"{[Item = "Company Metrics", Kind = "Sheet"]}[Data],
#"Promoted headers" = Table.PromoteHeaders(#"Navigation 4", [PromoteAllScalars = true]),
#"Changed column type" = Table.TransformColumnTypes(#"Promoted headers", {{"ID", Int64.Type}, {"Company", type text}, {"Check", Int64.Type}, {"Metric", type text}, {"Year", Int64.Type}, {"Value", type number}, {"Comment", type text}}),
#"Removed columns" = Table.RemoveColumns(#"Changed column type", {"Comment"}),
#"Inserted Merged Column" = Table.AddColumn(#"Removed columns", "RelationshipKey", each Text.Combine({[Company], " ", [Metric]}), type text)
in
#"Inserted Merged Column"
Company Comments Table
let
Source = Lakehouse.Contents(null),
Navigation = Source{[workspaceId = "<your-workspace-id>"]}[Data],
#"Navigation 1" = Navigation{[lakehouseId = "<your-lakehouse-id>"]}[Data],
#"Navigation 2" = #"Navigation 1"{[Id = "Files", ItemKind = "Folder"]}[Data],
#"Navigation 3" = #"Navigation 2"{[Name = "translytical - source tables.xlsx"]}[Content],
#"Imported Excel workbook" = Excel.Workbook(#"Navigation 3", null, true),
#"Navigation 4" = #"Imported Excel workbook"{[Item = "Company Metrics", Kind = "Sheet"]}[Data],
#"Promoted headers" = Table.PromoteHeaders(#"Navigation 4", [PromoteAllScalars = true]),
#"Changed column type" = Table.TransformColumnTypes(#"Promoted headers", {{"ID", Int64.Type}, {"Company", type text}, {"Check", Int64.Type}, {"Metric", type text}, {"Year", Int64.Type}, {"Value", type number}, {"Comment", type text}}),
#"Removed columns" = Table.RemoveColumns(#"Changed column type", {"Value", "Year", "Check"}),
#"Grouped rows" = Table.Group(#"Removed columns", {"Company", "Metric", "Comment"}, {{"ID", each List.Max([ID]), type nullable Int64.Type}}),
#"Reordered columns" = Table.ReorderColumns(#"Grouped rows", {"ID", "Company", "Metric", "Comment"}),
#"Inserted Merged Column" = Table.AddColumn(#"Reordered columns", "RelationshipKey", each Text.Combine({[Company], " ", [Metric]}), type text)
in
#"Inserted Merged Column"
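Conceptually, the Company Comments query deduplicates the metric rows into one comment row per company and metric, keeps the highest ID per group, and builds the key used to relate comments back to metrics. A minimal Python sketch of that same grouping logic (the sample rows are hypothetical; column names match the tables above):

```python
# Mirror of the Company Comments M query: keep one row per
# (Company, Metric, Comment) group, taking the max ID, and build the
# RelationshipKey that relates comments to metrics in the model.
# The sample rows below are hypothetical illustrations.
rows = [
    {"ID": 1, "Company": "Contoso", "Metric": "Revenue", "Comment": "Tariff impact"},
    {"ID": 2, "Company": "Contoso", "Metric": "Revenue", "Comment": "Tariff impact"},
    {"ID": 3, "Company": "Fabrikam", "Metric": "Inventory Days", "Comment": "Vendor change"},
]

grouped = {}
for row in rows:
    key = (row["Company"], row["Metric"], row["Comment"])
    # Equivalent of List.Max([ID]): keep the largest ID per group
    grouped[key] = max(grouped.get(key, 0), row["ID"])

comments = [
    {
        "ID": max_id,
        "Company": company,
        "Metric": metric,
        "Comment": comment,
        # Equivalent of Text.Combine({[Company], " ", [Metric]})
        "RelationshipKey": f"{company} {metric}",
    }
    for (company, metric, comment), max_id in grouped.items()
]

for c in comments:
    print(c["ID"], c["RelationshipKey"], "->", c["Comment"])
```

The RelationshipKey column exists in both generated tables, which is what lets the comments table relate one-to-many to the metrics table in the Power BI model.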
Set the Dataflow Gen2 destination to load the tables into a Fabric SQL database. In my case, I had created a Fabric SQL database called translytical_sql, so I simply pointed the Dataflow Gen2 destination settings to that database so that it creates a new table for each query.
Now that the source tables are set up, we can move on to creating the user data function. After creating the function, add the SQL database connection as a managed connection; then you can use the code below.
Don't bother trying to rename the default function that appears when you click "Add Function"; its name updates once you rename the Python function (the name right after def in your code). Ensure the alias is the same as your managed connection alias (see the image above for mine).
#insert_comment function
import fabric.functions as fn

udf = fn.UserDataFunctions()

@udf.connection(argName='sqlDB', alias='translyticalsql')
@udf.function()
def insert_comment(sqlDB: fn.FabricSqlConnection, metricId: int, metric: str, company: str, comment: str) -> str:
    '''
    Description: Updates the comment for a financial metric in the CompanyComments table.
    Args:
        sqlDB (fn.FabricSqlConnection): Fabric SQL database connection.
        metricId (int): metric ID (primary key).
        metric (str): the metric name (used in the confirmation message).
        company (str): the company name (used in the confirmation message).
        comment (str): the comment to add.
    Returns:
        str: Confirmation message about the update.
    '''
    # Set up the connection to the SQL database
    connection = sqlDB.connect()
    cursor = connection.cursor()

    # Update the comment in the CompanyComments table
    # (parameterized query, so user input is never concatenated into SQL)
    add_comment_query = "UPDATE [dbo].[CompanyComments] SET [comment] = ? WHERE [ID] = ?;"
    cursor.execute(add_comment_query, (comment, metricId))

    # Commit the transaction
    connection.commit()

    # Close the connection
    cursor.close()
    connection.close()

    return f'Financial metric "{metric}" for company "{company}" was updated with the following comment: {comment}'
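If you want to sanity-check the update logic outside Fabric, the same parameterized UPDATE-and-commit pattern can be exercised locally. This is a hedged sketch that substitutes Python's built-in sqlite3 for the Fabric SQL connection; the table definition and sample row are stand-ins, not the actual Fabric schema:

```python
import sqlite3

# Stand-in for the Fabric SQL database: an in-memory SQLite DB
# with a simplified CompanyComments table (hypothetical schema).
connection = sqlite3.connect(":memory:")
cursor = connection.cursor()
cursor.execute(
    "CREATE TABLE CompanyComments (ID INTEGER PRIMARY KEY, "
    "Company TEXT, Metric TEXT, Comment TEXT)"
)
cursor.execute(
    "INSERT INTO CompanyComments VALUES (1, 'Contoso', 'Revenue', NULL)"
)

# Same pattern as the UDF: parameterized UPDATE, then commit.
add_comment_query = "UPDATE CompanyComments SET Comment = ? WHERE ID = ?;"
cursor.execute(add_comment_query, ("Tariff impact on 2025 cost of sales", 1))
connection.commit()

# Verify the comment landed on the right row.
cursor.execute("SELECT Comment FROM CompanyComments WHERE ID = 1")
updated = cursor.fetchone()[0]
print(updated)

cursor.close()
connection.close()
```

The key design point carried over from the UDF is passing user-supplied values as query parameters rather than formatting them into the SQL string, which keeps comment text with quotes or special characters safe.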
With the function done, you can now move to the final part: building the Power BI report. I have attached the actual Power BI report. The major things to remember are:
And that's all! You can test in Power BI Desktop before publishing to the web service.
Enjoy your first/new business-relevant translytical!