daircom
Resolver II

Open discussion: Challenging the "best practice" Fabric solution

Hi all,

 

We are currently working on the implementation of Microsoft Fabric in our organization, based on best practices.

However, the implementation is being challenged by our report builders, and to be honest, I have to agree with them.

 

The idea behind the implementation is in line with Fabric best practices, i.e. how Microsoft intended it:

Bring all the data into OneLake via a medallion architecture, create semantic models per business unit, and let report builders build reports on top of those predefined semantic models.

The advantages of using preset semantic models are obvious to me:

- Business logic lives in fewer places

- If the underlying data changes, you only have to make adjustments in a few places

 

However:

Our report builders are very frustrated: when they use a Direct Lake connection to the predefined semantic models, they:

- cannot see the measure formulas

- cannot even adjust the formatting of values

- cannot make the smallest adjustment to the models, and have to consult BI engineers for even minor changes

- face severe limitations in Direct Lake compared to import mode (e.g. it is not possible to create calculated tables/columns using DAX)

 

To be honest: I agree with the report builders. Building a report is very frustrating now.

 

So honestly: although I see all the benefits of the default Fabric approach, I do not agree with this centralized semantic-model-per-business-unit approach, as it severely limits reporting capabilities and frustrates our report builders.

 

What are your opinions on this?  

1 ACCEPTED SOLUTION
burakkaragoz
Super User

Hi,

Balancing "Best Practice" Approaches in Microsoft Fabric
In your forum post, you mention a very valid challenge. You are experiencing a tension between Microsoft Fabric's recommended “best practice” approach and the needs of your reporting users.
I Understand Your Issues
Constraints your report builders face when you use predefined semantic models with DirectLake connectivity:

Not being able to see measure formulas
Not being able to adjust value formatting
Having to consult BI engineers for even minor changes
Not being able to create calculated tables/columns using DAX

This is the biggest drawback of the centralized semantic model approach: it reduces flexibility in the report creation process.
Middle Ground Recommendations
Here are a few solutions that can meet the needs of your report builders without losing the benefits of the centralized semantic model approach:

Hybrid Approach:

Create base semantic models for business units
But allow report builders to use a "composite model"
This allows them to add their own calculated metrics when using the base model


Authorized Self-Service Layer:

Keep centralized semantic models as an "enterprise baseline"
Give report authors the right to create their own semantic models, but based on the centralized model
For this approach, see the "Power BI Thin Models" concept


Collaboration Process:

Allow report authors to submit semantic model change requests on a regular basis
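
The thin-model idea above can be sketched with a report-level measure: the report connects live to the certified semantic model, and the author defines a measure locally without touching the shared model. The table and column names below (Sales[Amount], 'Date'[Date]) are hypothetical placeholders, not from the original post:

```dax
-- Report-level measure defined in a thin report that connects live
-- to the shared semantic model; the central model itself is unchanged.
-- 'Sales' and 'Date' are hypothetical tables from the certified model.
YoY Sales % =
VAR CurrentSales = SUM ( Sales[Amount] )
VAR PriorSales =
    CALCULATE ( SUM ( Sales[Amount] ), SAMEPERIODLASTYEAR ( 'Date'[Date] ) )
RETURN
    DIVIDE ( CurrentSales - PriorSales, PriorSales )
```

Report-level measures stay with the report, so they cannot break other reports built on the same shared model.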


4 REPLIES
v-veshwara-msft
Community Support

Hi @daircom ,

Just checking in to see if any responses were helpful. If so, kindly consider marking the helpful reply as 'Accepted Solution' to help others with similar queries.

Otherwise, feel free to reach out for further assistance.

Thank you.

v-veshwara-msft
Community Support

Hi @daircom ,

Thanks for engaging with the Microsoft Fabric Community and bringing up such an important and relatable issue. It's clear that while the centralized semantic model approach in Fabric brings many benefits, it also presents some real challenges, especially when it comes to empowering report builders with the flexibility they need.

 

I completely understand the frustration around the limitations of Direct Lake, especially things like not being able to see DAX formulas, adjust formatting, or create calculated columns/tables. These restrictions create a lot of dependency on BI teams for even minor adjustments, which can slow things down and frustrate users.

 

Some great points from @burakkaragoz on this - here's a summary of the ideas that could help find a balance:

  • Instead of relying on Direct Lake alone, consider a composite model (DirectQuery over the shared semantic model) so users can extend the base model with their own calculations and logic while still leveraging the certified data.

  • Give report authors more control by allowing them to create their own semantic models on top of the central models. This could empower them without completely breaking the governance model.

  •  A formal process for submitting requests to change the centralized model could help prevent bottlenecks while still ensuring that changes are managed effectively.
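
As a hedged illustration of the composite-model route: after choosing "Make changes to this model" in Power BI Desktop, the live connection becomes a local DirectQuery layer over the shared semantic model, and authors can add calculated columns or tables there, which Direct Lake alone does not allow. The Sales table and its Amount column are hypothetical names for the sketch:

```dax
-- Calculated column added in the local (composite) layer on top of the
-- shared semantic model; the governed central model is not modified.
-- 'Sales' and its Amount column are hypothetical placeholder names.
Order Size Band =
SWITCH (
    TRUE (),
    Sales[Amount] >= 10000, "Large",
    Sales[Amount] >= 1000,  "Medium",
    "Small"
)
```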

 

Regarding the siloed model per business unit, I think it’s something worth revisiting. Having models for each unit can create barriers to cross-business insights, and adding shared models could reduce fragmentation without losing the benefits of structured, governed data.

 

In the end, best practices are just that - guides, not strict rules. There’s definitely room to adapt the approach based on the unique needs of your team and report builders.

 

 

If this helps, please mark it as the accepted solution to assist others with similar queries and a kudos would be appreciated.

Best,
Vinay.
Fabric Community Support.


lbendlin
Super User

Interesting point about Direct Lake connections not being mutable; I will have to investigate that more.

 

But this here 

create semantic models per business unit

is (in my personal opinion) the greater issue. This siloed approach is the opposite of what is needed to provide insights across the entire company.

 

And nobody is complaining about the storage proliferation?
