J_Balaji
Frequent Visitor

How to Enable Both Schema Updates & Row-Level Security in Fabric Semantic Models?

We have set up a Lakehouse in Microsoft Fabric and are using a Dataflow to fetch data from SQL Server and load it into the Lakehouse. With the Lakehouse, we automatically get a SQL Analytics Endpoint and a Semantic Model.

Upon testing, we identified two types of models:

1. Default Semantic Model (Auto-generated with "Automatically update semantic model" enabled)

  • Supports schema updates when switching environments
  • Does not allow Row-Level Security (RLS) as "Open data model" is disabled

2. New Semantic Model (Created manually)

  • Allows RLS, as "Open data model" is enabled
  • Does not support automatic schema updates when environment changes in Dataflow
  • When we tried to update the server details in the Dataflow and then refresh the semantic model, we encountered the error below:

    [Screenshot of the error: J_Balaji_0-1738235136153.png]

The Challenge:

We need both schema update support (when environment changes in Dataflow) and the ability to apply RLS. Is there a workaround or best practice to achieve this in Fabric?

Any guidance on how to combine both functionalities, or an alternative approach, would be appreciated!

#MicrosoftFabric #PowerBI #FabricLakehouse #SemanticModel #Dataflows #RowLevelSecurity #RLS #SchemaUpdates #DataModeling #PowerBIService #LakehouseAnalytics #SQLAnalytics #FabricCommunity

2 ACCEPTED SOLUTIONS

Hi @v-hashadapu ,

 

We found a feasible workaround for the schema update issue using the XMLA endpoint with C# (.NET) from Visual Studio.

The implementation is in progress.

 

Thank you for the support.


J_Balaji
Frequent Visitor

Here is the code to modify the schema programmatically. I used Python to update the Server and Database parameters.
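For the parameter-update part mentioned above, one way to do it from Python is the Power BI REST API's "Update Parameters In Group" endpoint. This is a minimal sketch, assuming an AAD access token is obtained separately; the workspace/dataset IDs and the parameter names `Server` and `Database` are illustrative placeholders:

```python
import json
import urllib.request

def build_update_parameters_payload(parameters: dict) -> dict:
    """Build the request body expected by the UpdateParameters endpoint."""
    return {"updateDetails": [{"name": k, "newValue": v} for k, v in parameters.items()]}

def update_dataset_parameters(group_id: str, dataset_id: str, token: str, parameters: dict) -> None:
    """POST new parameter values (e.g. Server/Database) to the dataset."""
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
           f"/datasets/{dataset_id}/Default.UpdateParameters")
    body = json.dumps(build_update_parameters_payload(parameters)).encode("utf-8")
    req = urllib.request.Request(
        url, data=body, method="POST",
        headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # raises urllib.error.HTTPError on non-2xx responses
```

Note that a dataset refresh still has to be triggered afterwards for the new parameter values to take effect.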

 

Main.cs

using Microsoft.AnalysisServices.Tabular;
using System.Data;

namespace UdfCode
{
    public class MainService
    {
        private readonly AadService _aadService;
        private readonly Config _config;
        private readonly IHelperService _helperService;

        public MainService(AadService aadService, Config config, IHelperService helperService)
        {
            _aadService = aadService;
            _config = config;
            _helperService = helperService;
        }

        public async Task Run()
        {
            try
            {
                string? accessToken = await _aadService.GetAccessToken();

                if (string.IsNullOrEmpty(accessToken))
                {
                    throw new ArgumentNullException(nameof(accessToken), "Access token retrieval failed.");
                }

                string sqlConnectionString = _config.SqlConnectionString ?? throw new ArgumentNullException(nameof(_config.SqlConnectionString));
                string spectrumDB = _config.SpectrumDB ?? throw new ArgumentNullException(nameof(_config.SpectrumDB));
                string dimUdfQuery = _config.DimUdfQuery ?? throw new ArgumentNullException(nameof(_config.DimUdfQuery));
                string factUdfQuery = _config.FactUdfQuery ?? throw new ArgumentNullException(nameof(_config.FactUdfQuery));
                string PartitionDatabase = _config.PartitionDatabase ?? throw new ArgumentNullException(nameof(_config.PartitionDatabase));
                string KeyColumn = _config.KeyColumn ?? throw new ArgumentNullException(nameof(_config.KeyColumn));
                string TableName = _config.TableName ?? throw new ArgumentNullException(nameof(_config.TableName));
                string DimUdfTable = _config.DimUdfTable ?? throw new ArgumentNullException(nameof(_config.DimUdfTable));
                string FactUdfTable = _config.FactUdfTable ?? throw new ArgumentNullException(nameof(_config.FactUdfTable));
                string workspaceName = _config.workspaceName ?? throw new ArgumentNullException(nameof(_config.workspaceName));
                string databaseId = _config.DatabaseId ?? throw new ArgumentNullException(nameof(_config.DatabaseId));
                
                DataTable dimUdfDataTable = _helperService.LoadDataTable(
                    sqlConnectionString,
                    dimUdfQuery.Replace("{SpectrumDB}", spectrumDB),
                    "dimUdfDataTable",
                    out int dimUdfCount
                );

                DataTable factUdfDataTable = _helperService.LoadDataTable(
                    sqlConnectionString,
                    factUdfQuery.Replace("{SpectrumDB}", spectrumDB),
                    "factUdfDataTable",
                    out int factUdfCount
                );
               
                Model? model = _helperService.EstablishWorkspaceConnection(_config.workspaceName, _config.DatabaseId, accessToken) ?? throw new InvalidOperationException("Failed to establish workspace connection.");
                Table modelConnectingTable = model.Tables[_config.TableName];
                
                if (dimUdfCount == 0 && factUdfCount == 0)
                {
                    bool removedDim = _helperService.RemoveTableAndRelationships(model, _config.DimUdfTable);
                    bool removedFact = _helperService.RemoveTableAndRelationships(model, _config.FactUdfTable);

                    if (removedDim || removedFact)
                    {
                        model.RequestRefresh(RefreshType.Full);
                        Console.WriteLine($"\nPower BI Semantic Model \"{_helperService.DatasetName}\" refresh triggered!");
                        model.SaveChanges();
                        Console.WriteLine($"\nPower BI Semantic Model \"{_helperService.DatasetName}\" updated successfully!");
                    }
                    else
                    {
                        Console.WriteLine("\nNo UD fields found. Terminating the process.");
                    }
                }
                else
                {
                    _helperService.RemoveTableAndRelationships(model, _config.DimUdfTable);
                    _helperService.RemoveTableAndRelationships(model, _config.FactUdfTable);

                    if (dimUdfCount > 0)
                    {
                        Table dimUserDefinedFields = _helperService.CreateAndAddTable(model, _config.DimUdfTable);
                        _helperService.AddColumnsToTable(dimUserDefinedFields, dimUdfDataTable);
                        _helperService.CreatePartition(dimUserDefinedFields, "dimUserDefinedFields Partition", _config.DimUdfQuery, _config.PartitionDatabase);
                        _helperService.CreateRelationship(model, dimUserDefinedFields, modelConnectingTable, _config.KeyColumn, _config.TableName, _config.IsManyToOne);
                    }

                    if (factUdfCount > 0)
                    {
                        Table factUserDefinedFields = _helperService.CreateAndAddTable(model, _config.FactUdfTable);
                        _helperService.AddColumnsToTable(factUserDefinedFields, factUdfDataTable);
                        _helperService.CreatePartition(factUserDefinedFields, "factUserDefinedFields Partition", _config.FactUdfQuery, _config.PartitionDatabase);
                        _helperService.CreateRelationship(model, factUserDefinedFields, modelConnectingTable, _config.KeyColumn, _config.TableName, _config.IsManyToOne);
                    }

                    model.RequestRefresh(RefreshType.Full);
                    Console.WriteLine($"\nPower BI Semantic Model \"{_helperService.DatasetName}\" refresh triggered!");
                    model.SaveChanges();
                    Console.WriteLine($"\nPower BI Semantic Model \"{_helperService.DatasetName}\" updated successfully!");
                }
            }
            catch (Exception ex)
            {
                Console.WriteLine($"An error occurred: {ex.Message}");
            }
        }
    }
}

 

 

Program.cs

using System.Data;
using System.Text;
using Microsoft.Data.SqlClient;
using Microsoft.AnalysisServices.Tabular;
using TabularDataColumn = Microsoft.AnalysisServices.Tabular.DataColumn;

namespace UdfCode
{
    public class HelperService : IHelperService
    {
        public Server? server;
        public string DatasetName { get; private set; } = string.Empty;

        public Model? EstablishWorkspaceConnection(string workspaceName, string databaseId, string accessToken)
        {
            try
            {
                Console.WriteLine("\nConnecting to Power BI Workspace...");
                string workspaceConnection = $"powerbi://api.powerbi.com/v1.0/myorg/{workspaceName}";
                Console.WriteLine($"Workspace Connection: {workspaceConnection}");
                Console.WriteLine($"Dataset ID: {databaseId}");
                string connectString = $"Provider=MSOLAP;Data Source={workspaceConnection};User ID=;Password={accessToken};Persist Security Info=True;Impersonation Level=Impersonate;";
                server = new Server();  // assign the class-level field rather than shadowing it with a local
                server.Connect(connectString);
                Console.WriteLine("Connection established!");
                Model model = server.Databases[databaseId].Model;
                DatasetName = server.Databases[databaseId].Name;
                Console.WriteLine($"Connected to PBI Dataset: {model.Name}");
                return model;
            }
            catch (Exception ex)
            {
                Console.WriteLine($"Error: {ex.Message}");
                Console.WriteLine($"StackTrace: {ex.StackTrace}");
                return null;
            }
        }

        public DataTable LoadDataTable(string ConnectionString, string query, string tableName, out int ColumnCount)
        {
            DataTable dataTable = new();
            using (SqlConnection sqlConnection = new(ConnectionString))
            {
                Console.WriteLine("\nEstablishing SQL Server Connection");
                sqlConnection.Open();
                Console.WriteLine($"\nSQL Server connection successful for {tableName}!");
                using (SqlDataAdapter adapter = new(query, sqlConnection))
                {
                    adapter.Fill(dataTable);
                }
                Console.WriteLine($"\n{tableName} Loaded!");
                sqlConnection.Close();
            }
            ColumnCount = dataTable.Columns.Count - 1;
            Console.WriteLine($"UD Columns fetched into {tableName}: {ColumnCount}");
            return dataTable;
        }

        public bool RemoveTableAndRelationships(Model model, string tableName)
        {
            if (model.Tables.Contains(tableName))
            {
                Table table = model.Tables[tableName];
                var relationships = model.Relationships
                    .Where(r => r is SingleColumnRelationship rel &&
                                (rel.FromTable.Name == table.Name || rel.ToTable.Name == table.Name))
                    .ToList();
                foreach (var rel in relationships)
                {
                    model.Relationships.Remove(rel);
                }
                model.Tables.Remove(table);
                Console.WriteLine($"\nDeleted table: {tableName}");
                return true;
            }
            else
            {
                Console.WriteLine($"\nTable '{tableName}' not found in the model.");
                return false;
            }
        }

        public Table CreateAndAddTable(Model model, string tableName)
        {
            Table newTable = new()
            {
                Name = tableName,
                IsHidden = false
            };
            model.Tables.Add(newTable);
            Console.WriteLine($"\nCreated new table: {tableName}");
            return newTable;
        }

        public void AddColumnsToTable(Table targetTable, DataTable sourceDataTable)
        {
            for (int i = 0; i < sourceDataTable.Columns.Count; i++)
            {
                string columnName = sourceDataTable.Columns[i].ColumnName;
                Type columnType = sourceDataTable.Columns[i].DataType;
                DataType tabularDataType = ConvertToTabularDataType(columnType);
                if (!targetTable.Columns.ContainsName(columnName))
                {
                    TabularDataColumn newColumn = new()
                    {
                        Name = columnName,
                        DataType = tabularDataType,
                        SourceColumn = columnName,
                        IsHidden = (i == 0)
                    };

                    targetTable.Columns.Add(newColumn);
                    Console.WriteLine($"Added new column into {targetTable.Name}: {columnName}");
                }
                else
                {
                    Console.WriteLine($"Column {columnName} already exists in {targetTable.Name}. Skipping...");
                }
            }
        }

        public void CreatePartition(Table table, string partitionName, string query, string database)
        {
            if (table.Partitions.Count == 0)
            {
                Console.WriteLine($"\nPartition getting created for {table.Name}");
                Partition partition = new()
                {
                    Name = partitionName,
                    Mode = ModeType.Import,
                    DataView = DataViewType.Full,
                    Source = new MPartitionSource
                    {
                        Expression = ConvertToMQuery(query, database)
                    }
                };
                table.Partitions.Add(partition);
                Console.WriteLine($"Partition added to {table.Name}");
            }
        }

        public void CreateRelationship(Model model, Table fromTable, Table toTable, string keyColumn, string tableName, bool isManyToOne)
        {
            Console.WriteLine($"\nRelationship between {tableName} and {fromTable.Name} on {keyColumn} has started");
            SingleColumnRelationship relationship = new()
            {
                FromColumn = fromTable.Columns[keyColumn],
                ToColumn = toTable.Columns[keyColumn],
                FromCardinality = isManyToOne ? RelationshipEndCardinality.Many : RelationshipEndCardinality.One,
                ToCardinality = RelationshipEndCardinality.One,
                CrossFilteringBehavior = CrossFilteringBehavior.Automatic,
                IsActive = true
            };
            model.Relationships.Add(relationship);
            Console.WriteLine($"Established {(isManyToOne ? "Many-to-One" : "One-to-One")} relationship between {tableName} and {fromTable.Name} on {keyColumn}");
        }

        public DataType ConvertToTabularDataType(Type columnType)
        {
            if (columnType == typeof(string)) return DataType.String;
            if (columnType == typeof(int) || columnType == typeof(long)) return DataType.Int64;
            if (columnType == typeof(float) || columnType == typeof(double) || columnType == typeof(decimal)) return DataType.Double;
            if (columnType == typeof(bool)) return DataType.Boolean;
            if (columnType == typeof(DateTime)) return DataType.DateTime;
            return DataType.String;
        }

        public string ConvertToMQuery(string sqlQuery, string database)
        {
            sqlQuery = sqlQuery.Replace("{SpectrumDB}", " \" & SpectrumDB & \"");
            StringBuilder mQueryBuilder = new();
            mQueryBuilder.AppendLine("let");
            mQueryBuilder.AppendLine($"    Source = Sql.Database(Server, {database}, [Query = ");
            string[] sqlLines = sqlQuery.Split(new[] { "\r\n", "\n" }, StringSplitOptions.None);
            for (int i = 0; i < sqlLines.Length; i++)
            {
                string line = sqlLines[i].Trim();
                if (!string.IsNullOrWhiteSpace(line))
                {
                    mQueryBuilder.Append("        \"" + line + "#(lf)\"");
                    if (i < sqlLines.Length - 2)
                    {
                        mQueryBuilder.AppendLine(" &");
                    }
                }
            }
            mQueryBuilder.AppendLine("    ])");
            mQueryBuilder.AppendLine("in");
            mQueryBuilder.AppendLine("    Source");
            return mQueryBuilder.ToString();
        }
    }
}

 

 

Please mark this as the solution if it helped, and give it a like too.


11 REPLIES

v-hashadapu
Community Support

Hi @J_Balaji, please share the details of your solution here and mark it as 'Accept as solution' to assist others with similar problems. If the issue persists, please provide further details.
Thank you.

v-hashadapu
Community Support

Hi @J_Balaji, Hope your issue is solved. If it is, please consider marking the answer 'Accept as solution', so others with similar issues may find it easily. If it isn't, please share the details.
Thank you.

J_Balaji
Frequent Visitor

Hi @v-hashadapu ,

First, thank you so much for looking into the issue and taking the time to reply.

As for my understanding you suggested two methods to test.

Method 1:

  • I tried creating a new semantic model from Power BI Desktop with the "default semantic model" created under the Lakehouse as the source.
  • Unfortunately, the default model's name was not showing up in the list of available semantic model sources; instead it showed other semantic models that we published via pipeline and manually.
  • So I was unable to connect to the model on which we enabled the "automatically update semantic model" option.
  • So I connected to the Lakehouse source and published to the Power BI service.
  • This file was not able to capture the schema changes that I made later in the Dataflow, even though a subsequent manual refresh was successful.

Please correct me if I implemented this differently.

 

Method 2:
"you can manually enable the "Open data model" option in the default semantic model and handle schema updates using Power BI Desktop" --

  • I checked various settings to enable that option but didn't find a way. If you have any ideas, please suggest them.

"or automate them with scripts"

  • Are you talking about using Visual Studio to connect to the model and modify the schema using C# and .NET?

 

Thanks in advance : )

 

Hi @J_Balaji , thank you for reaching out to the Microsoft Fabric Community Forum.

  1. The default semantic model created in Fabric’s Lakehouse is not listed as a source in Power BI Desktop because only manually published semantic models appear in the available sources. Connect directly to the Lakehouse tables using Direct Lake mode in Power BI Desktop.
  2. When you create a new semantic model using Lakehouse tables as a source, it does not automatically update when schema changes occur in the Dataflow because the schema is set at the time of model creation. Use Power BI REST APIs or XMLA endpoints to refresh and synchronize schema updates. For multiple environments, use Power BI Deployment Pipelines to ensure schema consistency.
  3. There is no direct option in the Power BI UI to enable the "Open data model" setting for the default semantic model. Enable XMLA read/write mode and use Tabular Editor or Power BI REST APIs to modify the model. This allows you to apply RLS and manage schema updates within the default semantic model.
  4. The reference to automating schema updates does not mean using Visual Studio, C#, or .NET. Instead, use Power BI REST APIs to automate dataset refresh and schema synchronization, XMLA endpoints to modify the model dynamically without breaking existing reports, and Tabular Editor to update model metadata (including enabling RLS) via scripting.
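As a concrete illustration of the REST API route in point 2, the "Refresh Dataset In Group" call can be sketched from Python using only the standard library. The access token, workspace (group) ID, and dataset ID are placeholders you must supply; acquiring the Azure AD token (e.g. via MSAL or a service principal) is out of scope here:

```python
import json
import urllib.request

def build_refresh_url(group_id: str, dataset_id: str) -> str:
    # Route documented as 'Refresh Dataset In Group' in the Power BI REST API.
    return (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
            f"/datasets/{dataset_id}/refreshes")

def trigger_refresh(access_token: str, group_id: str, dataset_id: str) -> int:
    """POST a refresh request; HTTP 202 means the refresh was accepted."""
    body = json.dumps({"notifyOption": "NoNotification"}).encode()
    req = urllib.request.Request(
        build_refresh_url(group_id, dataset_id),
        data=body,
        headers={
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Calling this after the Dataflow refresh keeps the model's data in sync; note it does not by itself add new columns to a custom model, which is why the reply also mentions XMLA endpoints for metadata changes.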

If this helps, please consider marking it 'Accept as Solution' so others with similar queries may find it more easily. If not, please share the details.
Thank you.

Hi @v-hashadapu ,

Thanks for following up with the issue.

 

Sorry for the delayed response. Since I was off for the last couple of days, I wasn't able to look at your replies.


 

2.When you create a new semantic model using Lakehouse tables as a source, it does not automatically update when schema changes occur in the Dataflow because the schema is set at the time of model creation. Use Power BI REST APIs or XMLA endpoints to refresh and synchronize schema updates

 

About this: I didn't find a way to refresh and synchronize schema updates. It would be great if you could provide some references to get an idea of the implementation.

 

 

4. Instead, use Power BI REST APIs to automate dataset refresh and schema synchronization, XMLA Endpoints to modify the model dynamically without breaking existing reports, Tabular Editor to update model metadata, including enabling RLS, via scripting

Regarding this:

Does this point consist of two separate ideas? In the first line you mentioned using Power BI REST APIs to refresh and synchronize the schema, and in the next line you said to use XMLA to modify metadata. Is updating metadata via XMLA still needed even if we find some way to use the REST API to synchronize the schema?

 

Thanks in advance 🙂

Hi @J_Balaji, thank you for reaching out to the Microsoft Fabric Community Forum.

 

  1. Yes, the two points refer to different aspects. Power BI REST APIs are used to trigger dataset refreshes and synchronize schema changes from the Dataflow to the semantic model, while XMLA Endpoints are used to modify model metadata dynamically.
  2. If the REST API successfully synchronizes schema updates, you may not need XMLA for schema updates. However, XMLA is required for advanced model modifications like enabling RLS.
  3. For implementation, please refer to:

    https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-connect-tools

    https://learn.microsoft.com/en-us/rest/api/power-bi/datasets/refresh-dataset-in-group
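Since switching environments in this thread means repointing the `Server` and `SpectrumDB` Power Query parameters, one relevant REST route is "Update Parameters In Group". A minimal Python sketch follows; the IDs and parameter names are assumptions based on this thread's M query, and the API has documented limitations, so check the reference for your dataset type:

```python
import json
import urllib.request

def build_update_parameters_payload(params: dict) -> dict:
    """Shape the request body for 'Update Parameters In Group'
    (POST .../datasets/{id}/Default.UpdateParameters)."""
    return {"updateDetails": [{"name": k, "newValue": v}
                              for k, v in params.items()]}

def update_parameters(token: str, group_id: str, dataset_id: str,
                      params: dict) -> int:
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
           f"/datasets/{dataset_id}/Default.UpdateParameters")
    req = urllib.request.Request(
        url,
        data=json.dumps(build_update_parameters_payload(params)).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status  # 200 on success

# Hypothetical usage: repoint the model at another environment, then refresh.
# update_parameters(token, GROUP_ID, DATASET_ID,
#                   {"Server": "prod-sql.contoso.com", "SpectrumDB": "SpectrumProd"})
```

After updating the parameters, a dataset refresh is still required for the new source values to take effect.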

If this helps, please consider marking it 'Accept as Solution' so others with similar queries may find it more easily. If not, please share the details.
Thank you.

Hi @v-hashadapu ,

 

We found a feasible workaround to handle the schema update issue using XMLA endpoint with C#(.NET) from Visual Studio.

The implementation is in progress.
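For anyone following along before the full C#/.NET implementation is shared: the XMLA endpoint accepts TMSL commands, and the refresh command involved can be sketched as plain JSON (built here in Python; the model name is a placeholder, and actually sending the command requires an XMLA client such as SSMS, Tabular Editor, or the .NET Tabular Object Model libraries connected to `powerbi://api.powerbi.com/v1.0/myorg/<workspace>`):

```python
import json

def tmsl_full_refresh(database: str) -> str:
    """Return a TMSL 'refresh' command; type 'full' reloads the data
    and recalculates dependents for the whole database (model)."""
    command = {
        "refresh": {
            "type": "full",
            "objects": [{"database": database}],
        }
    }
    return json.dumps(command, indent=2)

print(tmsl_full_refresh("MySemanticModel"))
```

Schema changes themselves (new columns, changed partitions) are made by editing the model's metadata over the same endpoint before issuing the refresh.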

 

Thank you for the support.

Hi @J_Balaji , glad you sorted out the schema update issue. I encountered a similar issue recently; would it be possible for you to share your experience using the XMLA endpoint to refresh the schema and reframe the model?

Hi @J_Balaji, We are pleased to hear that you have found a workaround. Kindly let us know if it successfully resolved your issue. If it did, please share the details here and mark it as 'Accept as solution' to assist others with similar problems. If it did not, please provide further details.
Thank you.

v-hashadapu
Community Support

Hi @J_Balaji , thank you for reaching out to the Microsoft Fabric Community Forum.


To achieve both schema update support and Row-Level Security (RLS) in Microsoft Fabric, please use the following approach:

  1. Keep the default semantic model for its ability to handle schema updates when the Dataflow environment changes. Use this model for data ingestion and schema management.
  2. Create a new semantic model manually and enable the "Open data model" option. Use this model for applying RLS and building reports.
  3. Use the default semantic model as the source for the new semantic model. This ensures that schema updates from the default model propagate to the custom model.
  4. As a best practice, test the integration between the models to ensure schema updates propagate correctly, monitor the performance of the custom semantic model (especially if using DirectQuery), and document the steps for schema updates and RLS configuration to ensure consistency across environments.
  5. Alternatively, if you prefer to use a single semantic model, you can manually enable the "Open data model" option in the default semantic model and handle schema updates using Power BI Desktop or automate them with scripts.

If this helps, please consider marking it 'Accept as Solution' so others with similar queries may find it more easily. If not, please share the details.
Thank you.
