Hello,
I'm looking to understand common approaches, considerations, and best practices for creating and managing an enterprise function library. Ideally, Fabric items/Power BI Desktop should be able to call scripts/functions for ETL. The functions would be a combination of:
These functions could be stored as text files in DevOps, if required.
Hi @adudani ,
This is a critical architectural step for scaling analytics. You have correctly identified the gap: unlike Python or SQL, neither Power BI nor Fabric currently offers a native "import my_corporate_library" statement for DAX or M that works dynamically at runtime in the reporting layer.
To create a true Enterprise Function Library, you must shift from a "Runtime Call" mindset to a "Deploy & Inject" mindset.
Here are the best practice patterns to architect this today.
Since M code cannot be "called" externally by a PBIX file, you have two routes to centralize ETL logic:
The "Compute" Library (Dataflows Gen2):
Concept: You create a specific Fabric Workspace called "ETL_Library". You create Dataflows that contain your standard functions (e.g., fxCleanPhoneNumbers, fxFiscalCalendar).
Usage: Downstream developers connect to the output of these dataflows.
Limitation: This centralizes the execution, not the function code itself for local reuse.
The "Code" Library (Custom Connectors):
Concept: This is the only true "Function Library" approach for M. You wrap your .m scripts into a Power Query Custom Connector (.mez) using the Power Query SDK.
DevOps: Store individual .pq files in Git. Use a build pipeline to compile them into a .mez file.
Usage: Developers install this connector. They see your company icon in "Get Data" and can invoke MyCorp.CalculateWorkingDays() directly in their queries.
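On the build side, a .mez file is a ZIP container holding the connector's .pq/.m sources and resources. Production builds normally use the Power Query SDK (MakePQX), but the packaging step a DevOps pipeline performs can be sketched in Python as follows (file and library names here are illustrative, not from any real project):

```python
import zipfile
from pathlib import Path

def build_mez(source_dir: str, output_path: str) -> list[str]:
    """Package every .pq file in source_dir into a .mez archive.

    A .mez is a ZIP container; real builds typically go through the
    Power Query SDK (MakePQX). This only illustrates the packaging step.
    """
    packaged = []
    with zipfile.ZipFile(output_path, "w", zipfile.ZIP_DEFLATED) as mez:
        for pq_file in sorted(Path(source_dir).glob("*.pq")):
            mez.write(pq_file, arcname=pq_file.name)  # store at archive root
            packaged.append(pq_file.name)
    return packaged
```

In an Azure DevOps pipeline this would run after checkout, and the resulting .mez would be published as a build artifact for developers to install.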
DAX does not support persistent User Defined Functions (UDFs) in the model. A measure defined in one model is invisible to another.
Calculation Groups (The Modern "UDF"):
Instead of writing Sales YTD, Cost YTD, Margin YTD, you create a Calculation Group called "Time Intelligence".
It contains items such as "YTD", "YoY", and "PY".
Benefit: This is a single generic logic block that applies to any measure in the model. This is the closest native feature to a "DAX Library."
C# Scripts (Tabular Editor - The "Injection" Method):
Concept: You store your DAX patterns (standard KPIs, color logic) as C# Scripts in DevOps.
Usage: Developers do not "write" the DAX. They open Tabular Editor, run the script Inject_Standard_Measures, and the script programmatically writes the DAX into their model based on the columns they select.
Maintenance: When the standard changes, you update the script. Models are updated during their next deployment cycle.
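The core of such an injection script is a template table that expands once per base measure. Tabular Editor scripts are written in C#; the sketch below uses Python only to make the template-expansion idea concrete (measure and template names are hypothetical):

```python
# Build-time standardization sketch: expand DAX templates per base measure.
# In practice this logic lives in a Tabular Editor C# script; the names
# and templates below are illustrative.
TEMPLATES = {
    "YTD": "CALCULATE([{m}], DATESYTD('Date'[Date]))",
    "PY":  "CALCULATE([{m}], SAMEPERIODLASTYEAR('Date'[Date]))",
}

def generate_measures(base_measures: list[str]) -> dict[str, str]:
    """Return {new measure name: DAX expression} for each base measure."""
    out = {}
    for m in base_measures:
        for suffix, template in TEMPLATES.items():
            out[f"{m} {suffix}"] = template.format(m=m)
    return out
```

When the corporate standard changes, only `TEMPLATES` changes; every model picks up the new definitions at its next deployment.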
To manage this, you need a repository structure that separates logic from the .pbip or .bim files.
| Layer | Object Type | Storage Format (DevOps) | Deployment Mechanism |
|---|---|---|---|
| ETL | M function | .pq (text files) | Custom connector (build) OR Fabric notebooks (Python libraries) |
| Semantic | DAX logic | .cs (C# scripts) or .tmdl (TMDL) | Tabular Editor CLI (during CI/CD pipeline) |
| Generic | Time intelligence | .tmdl (calculation group) | Fabric Git integration (TMDL) |
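An illustrative repository layout following that separation (folder names are hypothetical):

```
corp-bi-library/
├── etl/            # .pq files, compiled into a .mez by the build pipeline
├── semantic/       # Tabular Editor C# scripts (.cs) for measure injection
├── calc-groups/    # Calculation group definitions (TMDL)
└── pipelines/      # azure-pipelines.yml build/deploy definitions
```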
For M: Use Fabric Dataflows for shared data, but build a Custom Connector if you need shared logic (functions) across desktop files.
For DAX: Do not look for UDFs. Lean heavily on Calculation Groups for runtime logic and Tabular Editor Scripts for build-time standardization.
For Fabric: If you are in Fabric, consider moving complex "Function" logic out of M/DAX and into Notebooks (Python/PySpark). Python supports native library imports (import my_corp_utils), which solves this problem instantly for the ETL layer.
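To make the Python route concrete, here is a minimal sketch of what such a shared module might contain (the module name `my_corp_utils` and the function are hypothetical; in Fabric the package would typically be attached to notebooks as a custom library or environment dependency):

```python
# my_corp_utils.py -- shared ETL helpers, versioned in DevOps and
# attached to Fabric notebooks as a custom library (names illustrative).
import re

def clean_phone_number(raw: str, country_code: str = "+1") -> str:
    """Strip formatting characters and normalize to an E.164-style number."""
    digits = re.sub(r"\D", "", raw)
    if digits.startswith("1") and len(digits) == 11:
        digits = digits[1:]  # drop a leading US country digit if present
    return f"{country_code}{digits}"
```

Any notebook can then do `from my_corp_utils import clean_phone_number`, which is exactly the runtime "import my library" behavior that M and DAX lack.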
Next Step:
Would you like an example of a C# Script for Tabular Editor that automatically generates a set of standard Time Intelligence measures for selected columns?
If my response resolved your query, kindly mark it as the Accepted Solution to assist others. Additionally, I would be grateful for a 'Kudos' if you found my response helpful.
This response was assisted by AI for translation and formatting purposes.
Thank you, @burakkaragoz, for your response.
Hi adudani,
We would like to enquire whether the solution provided by @burakkaragoz resolves your issue. We hope the information clarifies your query. If you have any further questions, please feel free to contact the Microsoft Fabric community.
Thank you.
Thank you so much, @burakkaragoz! This is very insightful.