
Reply
adudani
Super User

Function Library - Create and Maintain

Hello,

 

I'm looking to understand common approaches, considerations, and best practices for creating and managing an enterprise function library. Ideally, Fabric items and Power BI Desktop should be able to call these scripts/functions for ETL. The functions would be a combination of:

  • DAX UDFs in Fabric
  • DAX UDFs in Power BI Desktop
  • Power Query M functions

These functions could be stored as .txt files in DevOps, if required.

 

Did I answer your question? Mark my post as a solution, this will help others!
If my response(s) assisted you in any way, don't forget to drop me a Kudos 🙂
Kind Regards,
Avinash
2 REPLIES
burakkaragoz
Community Champion

Hi @adudani ,

This is a critical architectural step for scaling analytics. You have correctly identified the gap: unlike Python or SQL, neither Power BI nor Fabric currently offers a native "import my_corporate_library" statement for DAX or M that works dynamically at runtime in the reporting layer.

To create a true Enterprise Function Library, you must shift from a "Runtime Call" mindset to a "Deploy & Inject" mindset.

Here are the best practice patterns to architect this today.

1. The Strategy for Power Query (M) Functions

Since M code cannot be "called" externally by a PBIX file, you have two routes to centralize ETL logic:

  • The "Compute" Library (Dataflows Gen2):

    • Concept: You create a specific Fabric Workspace called "ETL_Library". You create Dataflows that contain your standard functions (e.g., fxCleanPhoneNumbers, fxFiscalCalendar).

    • Usage: Downstream developers connect to the Result of these dataflows.

    • Limitation: This centralizes the execution, not the function code itself for local reuse.

  • The "Code" Library (Custom Connectors):

    • Concept: This is the only true "Function Library" approach for M. You wrap your .m scripts into a Power Query Custom Connector (.mez) using the Power Query SDK.

    • DevOps: Store individual .pq files in Git. Use a build pipeline to compile them into a .mez file.

    • Usage: Developers install this connector. They see your company icon in "Get Data" and can invoke MyCorp.CalculateWorkingDays() directly in their queries.
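As a concrete illustration, a shared .pq file might look like the sketch below. The function name fxCleanPhoneNumbers comes from the example above; the cleaning rule itself (keep digits only) is a hypothetical standard, not a prescribed one.

```
// fxCleanPhoneNumbers.pq -- illustrative sketch of a shared M function
// stored in Git and compiled into the .mez connector.
(phone as nullable text) as nullable text =>
    let
        // Keep only the digit characters; drop spaces, dashes, parentheses
        digitsOnly = if phone = null
            then null
            else Text.Select(phone, {"0".."9"})
    in
        digitsOnly
```

Once the connector is installed, a developer would invoke it like any other M function, e.g. MyCorp.fxCleanPhoneNumbers("(555) 010-2030").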

2. The Strategy for DAX (Semantic Layer)

DAX does not support persistent User Defined Functions (UDFs) in the model. A measure defined in one model is invisible to another.

  • Calculation Groups (The Modern "UDF"):

    • Instead of writing Sales YTD, Cost YTD, Margin YTD, you create a Calculation Group called "Time Intelligence".

    • It contains Items like "YTD", "YoY", "PY".

    • Benefit: This is a single generic logic block that applies to any measure in the model. This is the closest native feature to a "DAX Library."

  • C# Scripts (Tabular Editor - The "Injection" Method):

    • Concept: You store your DAX patterns (standard KPIs, color logic) as C# Scripts in DevOps.

    • Usage: Developers do not "write" the DAX. They open Tabular Editor, run the script Inject_Standard_Measures, and the script programmatically writes the DAX into their model based on the columns they select.

    • Maintenance: When the standard changes, you update the script. Models are updated during their next deployment cycle.
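To make the Calculation Group idea concrete, here is a minimal sketch of what a "YTD" calculation item might contain, as you would author it in Tabular Editor ('Date'[Date] is an assumed date column in the target model):

```
-- Calculation item "YTD" inside a "Time Intelligence" calculation group.
-- SELECTEDMEASURE() is replaced at runtime by whichever measure the
-- user puts in the visual, so one item covers Sales, Cost, Margin, etc.
CALCULATE(
    SELECTEDMEASURE(),
    DATESYTD('Date'[Date])
)
```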

3. Architecture Overview: The DevOps Flow

To manage this, you need a repository structure that separates logic from the .pbip or .bim files.

| Layer    | Object Type | Storage Format (DevOps)          | Deployment Mechanism                                         |
|----------|-------------|----------------------------------|--------------------------------------------------------------|
| ETL      | M Function  | .pq (text files)                 | Custom Connector (build) OR Fabric Notebooks (Python libraries) |
| Semantic | DAX Logic   | .cs (C# scripts) or .json (TMDL) | Tabular Editor CLI (during CI/CD pipeline)                   |
| Generic  | Time Intel  | .json (Calculation Group)        | TMDL (Fabric Git Integration)                                |
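One possible repository layout that keeps this logic separate from the .pbip/.bim files (all folder and file names are illustrative):

```
/etl/m-functions/        fxCleanPhoneNumbers.pq, fxFiscalCalendar.pq
/semantic/scripts/       Inject_Standard_Measures.cs
/semantic/calc-groups/   TimeIntelligence.json
/pipelines/              build-connector.yml, deploy-model.yml
```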

Summary Recommendation

  1. For M: Use Fabric Dataflows for shared data, but build a Custom Connector if you need shared logic (functions) across desktop files.

  2. For DAX: Do not look for UDFs. Lean heavily on Calculation Groups for runtime logic and Tabular Editor Scripts for build-time standardization.

  3. For Fabric: If you are in Fabric, consider moving complex "Function" logic out of M/DAX and into Notebooks (Python/PySpark). Python supports native library imports (import my_corp_utils), which solves this problem instantly for the ETL layer.

Next Step:

Would you like an example of a C# Script for Tabular Editor that automatically generates a set of standard Time Intelligence measures for selected columns?


If my response resolved your query, kindly mark it as the Accepted Solution to assist others. Additionally, I would be grateful for a 'Kudos' if you found my response helpful.

This response was assisted by AI for translation and formatting purposes.

Thank you so much, @burakkaragoz!

This is very insightful.

Kind Regards,
Avinash
