
J_Balaji
Frequent Visitor

How to Enable Both Schema Updates & Row-Level Security in Fabric Semantic Models?

We have set up a Lakehouse in Microsoft Fabric and are using a Dataflow to fetch data from SQL Server and load it into the Lakehouse. With the Lakehouse, we automatically get a SQL Analytics Endpoint and a Semantic Model.

Upon testing, we identified two types of models:

1. Default Semantic Model (Auto-generated with "Automatically update semantic model" enabled)

  • Supports schema updates when switching environments
  • Does not allow Row-Level Security (RLS) as "Open data model" is disabled

2. New Semantic Model (Created manually)

  • Allows RLS, as "Open data model" is enabled
  • Does not support automatic schema updates when environment changes in Dataflow
  • When we tried to update the server details in the Dataflow and refresh the semantic model, we encountered the error below.

    (Screenshot of the refresh error attached: J_Balaji_0-1738235136153.png)

The Challenge:

We need both schema update support (when environment changes in Dataflow) and the ability to apply RLS. Is there a workaround or best practice to achieve this in Fabric?

Any guidance on how to combine both functionalities, or an alternative approach, would be appreciated!

#MicrosoftFabric #PowerBI #FabricLakehouse #SemanticModel #Dataflows #RowLevelSecurity #RLS #SchemaUpdates #DataModeling #PowerBIService #LakehouseAnalytics #SQLAnalytics #FabricCommunity

1 ACCEPTED SOLUTION

Hi @v-hashadapu ,

 

We found a feasible workaround for the schema update issue using the XMLA endpoint with C# (.NET) from Visual Studio.

The implementation is in progress.

 

Thank you for the support.

View solution in original post

9 REPLIES
v-hashadapu
Community Support

Hi @J_Balaji , please share the details of your solution here and mark it as 'Accept as solution' to assist others with similar problems. If the issue persists, please provide further details.
Thank you.

v-hashadapu
Community Support

Hi @J_Balaji, Hope your issue is solved. If it is, please consider marking the answer 'Accept as solution', so others with similar issues may find it easily. If it isn't, please share the details.
Thank you.

J_Balaji
Frequent Visitor

Hi @v-hashadapu ,

Firstly, I would like to thank you so much for considering the issue and taking the time to post a reply.

As I understand it, you suggested two methods to test.

Method 1:

  • I tried creating a new semantic model from Power BI Desktop with the "default semantic model" created under the Lakehouse as the source.
  • Unfortunately, the default model's name did not show up in the list of available semantic model sources; instead, the list showed other semantic models that we published via pipeline and manually.
  • So I was unable to connect to the model on which we enabled the "automatically update semantic model" option.
  • So I connected to the Lakehouse source instead and published to the Power BI service.
  • This file was not able to capture the schema changes that I made later in the Dataflow after a manual refresh, even though the refresh was successful.

Please correct me if I implemented this differently from what you intended.

 

Method 2:
"you can manually enable the "Open data model" option in the default semantic model and handle schema updates using Power BI Desktop" --

  • I checked various ways to enable that option but didn't find any. If you have any ideas, please let me know.

"or automate them with scripts"

  • Are you referring to using Visual Studio Code to connect to the model and modify the schema using C# and .NET?

 

Thanks in advance : )

 

Hi @J_Balaji , thank you for reaching out to the Microsoft Fabric Community Forum.

  1. The default semantic model created in Fabric’s Lakehouse is not listed as a source in Power BI Desktop because only manually published semantic models appear in the available sources. Connect directly to the Lakehouse tables using Direct Lake mode in Power BI Desktop.
  2. When you create a new semantic model using Lakehouse tables as a source, it does not automatically update when schema changes occur in the Dataflow because the schema is set at the time of model creation. Use Power BI REST APIs or XMLA endpoints to refresh and synchronize schema updates. For multiple environments, use Power BI Deployment Pipelines to ensure schema consistency.
  3. There is no direct option in the Power BI UI to enable the "Open data model" setting for the default semantic model. Enable XMLA read/write mode and use Tabular Editor or Power BI REST APIs to modify the model. This allows you to apply RLS and manage schema updates within the default semantic model.
  4. The reference to automating schema updates does not mean using Visual Studio, C#, or .NET. Instead, use Power BI REST APIs to automate dataset refresh and schema synchronization; XMLA endpoints to modify the model dynamically without breaking existing reports; and Tabular Editor to update model metadata, including enabling RLS, via scripting.
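As a minimal sketch of the REST API route mentioned in point 4: triggering a dataset refresh is a single POST to the refreshes endpoint. The workspace GUID, dataset GUID, and Azure AD token below are placeholders, not values from this thread, and obtaining the token is out of scope here.

```python
import json
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def build_refresh_url(group_id, dataset_id):
    """Power BI REST endpoint that triggers a dataset refresh."""
    return f"{API_BASE}/groups/{group_id}/datasets/{dataset_id}/refreshes"

def build_refresh_request(group_id, dataset_id, token):
    """Prepare the POST request for an asynchronous dataset refresh.

    Sending it with urllib.request.urlopen(...) returns 202 Accepted;
    the service then refreshes the semantic model in the background.
    """
    body = json.dumps({"notifyOption": "MailOnFailure"}).encode()
    return urllib.request.Request(
        build_refresh_url(group_id, dataset_id),
        data=body,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# Placeholder IDs -- replace with real values from your tenant.
req = build_refresh_request("<workspace-guid>", "<dataset-guid>", "<aad-token>")
```

Note that this only triggers a data refresh; schema changes coming from the Dataflow still need the model metadata to be updated (XMLA/Tabular Editor), as the points above distinguish.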

If this helps, please consider marking it 'Accept as Solution' so others with similar queries may find it more easily. If not, please share the details.
Thank you.

Hi @v-hashadapu ,

Thanks for following up with the issue.

 

Sorry for the delayed response. Since I was off for the last couple of days, I wasn't able to look at your replies.


 

"2. When you create a new semantic model using Lakehouse tables as a source, it does not automatically update when schema changes occur in the Dataflow because the schema is set at the time of model creation. Use Power BI REST APIs or XMLA endpoints to refresh and synchronize schema updates" --

 

About this, I didn't find a way to refresh and synchronize schema updates. It would be great if you could provide some references to get an idea of the implementation.

 

 

"4. Instead, use Power BI REST APIs to automate dataset refresh and schema synchronization, XMLA Endpoints to modify the model dynamically without breaking existing reports, Tabular Editor to update model metadata, including enabling RLS, via scripting" --

Regarding this:

Does this point consist of two separate ideas? In the first line, you mention using Power BI REST APIs to refresh and synchronize the schema, and in the next line you say to use XMLA to modify the metadata. Is updating the metadata still needed even if we find some way to use the REST API to synchronize the schema?

 

Thanks in advance 🙂

 

 


 

Hi @J_Balaji, thank you for reaching out to the Microsoft Fabric Community Forum.

 

  1. Yes, the two points refer to different aspects. Power BI REST APIs are used to trigger dataset refreshes and synchronize schema changes from the Dataflow to the semantic model, while XMLA Endpoints are used to modify model metadata dynamically.
  2. If the REST API successfully synchronizes schema updates, you may not need XMLA for schema updates. However, XMLA is required for advanced model modifications like enabling RLS.
  3. For implementation, please refer to:

    https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-connect-tools

    https://learn.microsoft.com/en-us/rest/api/power-bi/datasets/refresh-dataset-in-group
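One concrete shape the XMLA route can take is a TMSL refresh command executed against the workspace's XMLA endpoint (for example from SSMS, Tabular Editor, or an AMO client in .NET). The sketch below only builds the command JSON; the model name is a placeholder.

```python
import json

def tmsl_full_refresh(database_name):
    """Build a TMSL 'refresh' command for a full reprocess of one model.

    Executed over the XMLA endpoint, this reprocesses the semantic
    model (the "database") named `database_name`.
    """
    return json.dumps(
        {"refresh": {"type": "full",
                     "objects": [{"database": database_name}]}},
        indent=2)

print(tmsl_full_refresh("SalesModel"))  # "SalesModel" is a placeholder name
```

The same XMLA connection also accepts createOrReplace/alter commands for metadata changes such as role definitions, which is why XMLA remains relevant for RLS even when the REST API handles refreshes.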

If this helps, please consider marking it 'Accept as Solution' so others with similar queries may find it more easily. If not, please share the details.
Thank you.

Hi @v-hashadapu ,

 

We found a feasible workaround for the schema update issue using the XMLA endpoint with C# (.NET) from Visual Studio.

The implementation is in progress.

 

Thank you for the support.

Hi @J_Balaji, We are pleased to hear that you have found a workaround. Kindly let us know if it successfully resolved your issue. If it did, please share the details here and mark it as 'Accept as solution' to assist others with similar problems. If it did not, please provide further details.
Thank you.

v-hashadapu
Community Support

Hi @J_Balaji , thank you for reaching out to the Microsoft Fabric Community Forum.


To achieve both schema update support and Row-Level Security (RLS) in Microsoft Fabric, please use the following approach:

  1. Keep the default semantic model for its ability to handle schema updates when the Dataflow environment changes. Use this model for data ingestion and schema management.
  2. Create a new semantic model manually and enable the "Open data model" option. Use this model for applying RLS and building reports.
  3. Use the default semantic model as the source for the new semantic model. This ensures that schema updates from the default model propagate to the custom model.
  4. As best practices: test the integration between the models to ensure schema updates propagate correctly; monitor the performance of the custom semantic model, especially if using DirectQuery; and document the steps for schema updates and RLS configuration to ensure consistency across environments.
  5. Alternatively, if you prefer to use a single semantic model, you can manually enable the "Open data model" option in the default semantic model and handle schema updates using Power BI Desktop or automate them with scripts.
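If schema updates are automated with scripts as in point 5, the script can also poll the dataset's refresh history through the REST API to confirm that a triggered refresh finished. A minimal sketch, with placeholder IDs and token:

```python
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def build_history_request(group_id, dataset_id, token, top=1):
    """GET the most recent refresh history entries for a dataset.

    Each entry in the response carries a 'status' field (for example
    'Completed' or 'Failed'), so polling this endpoint lets a script
    verify that an automated refresh succeeded before proceeding.
    """
    url = (f"{API_BASE}/groups/{group_id}/datasets/{dataset_id}"
           f"/refreshes?$top={top}")
    return urllib.request.Request(
        url, headers={"Authorization": f"Bearer {token}"})

# Placeholder IDs -- replace with real values from your tenant.
req = build_history_request("<workspace-guid>", "<dataset-guid>", "<aad-token>")
```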

If this helps, please consider marking it 'Accept as Solution' so others with similar queries may find it more easily. If not, please share the details.
Thank you.
