Hello, we use Power BI Embedded and allow users to save copies of reports. This works great until the semantic model changes: a table name change, a column name change, etc.
Any user-defined reports that refer to the renamed tables and/or columns no longer work. We have multiple customers, so this escalates quickly.
We do our best to minimize this, but occasionally it is unavoidable. Is there a way to programmatically detect the impact in advance and/or automatically apply the table/column name changes to user-defined reports?
Thanks!
-Rob
Solved! Go to Solution.
Hi @rdumont ,
Power BI Embedded users face report breakage when table or column names change in the semantic model. To prevent this, schema changes must be detected in advance and reports updated automatically.
Using the Power BI REST API, the dataset schema can be extracted and compared with previous versions to detect changes. The Get Dataset Tables API fetches table and column structures (note that this endpoint only returns results for push datasets; for regular import or DirectQuery models, use the XMLA route below):
GET https://api.powerbi.com/v1.0/myorg/datasets/{datasetId}/tables
Authorization: Bearer {access_token}
For deeper analysis in Premium/PPU workspaces, XMLA Endpoint DMV queries provide metadata:
SELECT * FROM $System.TMSCHEMA_COLUMNS
Once changes are detected, user reports can be updated by exporting each report, rewriting the outdated references, and re-importing it. Note that the Export API returns a binary .pbix file (the report layout JSON lives inside the archive), and re-importing goes through the Imports API as a multipart upload rather than a JSON POST:
GET https://api.powerbi.com/v1.0/myorg/reports/{reportId}/Export
POST https://api.powerbi.com/v1.0/myorg/imports?datasetDisplayName={reportName}
Authorization: Bearer {access_token}
Content-Type: multipart/form-data
A more seamless approach is to maintain column aliases through SQL views, or to create duplicate calculated columns under the old names before renaming. Automating schema checks and report updates keeps user impact to a minimum. Let me know if you'd like a script to automate this.
Best regards,
fantastic response, thank you!
We do use a view layer to further reduce the impact; however, on occasion it will still happen. If you could provide a script to automate this, I would be delighted!
-Rob
Hi @rdumont ,
The best approach to prevent Power BI Embedded reports from breaking due to schema changes is to automate schema detection and report updates. Using the Power BI REST API, you can extract the dataset schema and compare it with previous versions to detect renamed or deleted tables and columns. This can be done using the endpoint below (keep in mind that Get Tables only returns results for push datasets; for regular models, use the XMLA or DAX-query route that follows):
GET https://api.powerbi.com/v1.0/myorg/datasets/{datasetId}/tables
Authorization: Bearer {access_token}
For Premium or PPU workspaces, XMLA Endpoint DMV queries provide deeper metadata insights:
SELECT * FROM $System.TMSCHEMA_COLUMNS
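If scripting against the XMLA endpoint is inconvenient, a workaround for regular (non-push) models is the Execute Queries REST endpoint with the DAX INFO functions. This is only a sketch: it assumes the model supports INFO.TABLES()/INFO.COLUMNS() (available on current semantic models) and that the caller has sufficient permissions on the dataset:
import requests

def get_model_columns(access_token, dataset_id):
    # Execute Queries runs DAX against the model; INFO.COLUMNS() yields one
    # row per column (join its TableID to INFO.TABLES() for table names).
    url = f"https://api.powerbi.com/v1.0/myorg/datasets/{dataset_id}/executeQueries"
    body = {"queries": [{"query": "EVALUATE INFO.COLUMNS()"}],
            "serializerSettings": {"includeNulls": True}}
    headers = {"Authorization": f"Bearer {access_token}"}
    response = requests.post(url, headers=headers, json=body)
    response.raise_for_status()
    return response.json()["results"][0]["tables"][0]["rows"]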
Once schema changes are detected, user reports can be updated by exporting each report, rewriting the outdated references, and re-importing it. The export and import operations use the following endpoints; note that Export returns the report as a binary .pbix file, and the re-import is a multipart upload to the Imports API:
GET https://api.powerbi.com/v1.0/myorg/reports/{reportId}/Export
Authorization: Bearer {access_token}
POST https://api.powerbi.com/v1.0/myorg/imports?datasetDisplayName={reportName}
Authorization: Bearer {access_token}
Content-Type: multipart/form-data
A Python script can automate this process by retrieving the dataset schema, storing previous versions, detecting changes, and updating reports accordingly. The script starts by authenticating with Power BI and fetching the dataset schema:
import requests
import json

TENANT_ID = "your-tenant-id"
CLIENT_ID = "your-client-id"
CLIENT_SECRET = "your-client-secret"
WORKSPACE_ID = "your-workspace-id"
DATASET_ID = "your-dataset-id"

AUTH_URL = f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token"
# Group-scoped base URL, since embedded content lives in a workspace.
API_BASE = f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"

def get_access_token():
    # The v2.0 token endpoint takes 'scope' (not 'resource'); the scope for
    # the Power BI REST API is the analysis.windows.net resource below.
    data = {
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": "https://analysis.windows.net/powerbi/api/.default",
    }
    response = requests.post(AUTH_URL, data=data)
    response.raise_for_status()
    return response.json()["access_token"]

def get_dataset_schema():
    # Note: Get Tables only returns results for push datasets (see above).
    headers = {"Authorization": f"Bearer {get_access_token()}"}
    response = requests.get(f"{API_BASE}/datasets/{DATASET_ID}/tables", headers=headers)
    response.raise_for_status()
    return response.json()
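As a minimal sketch of the comparison step (detect_changes and the snapshot layout are illustrative, not part of any Power BI API; from two snapshots alone a rename is indistinguishable from a delete plus an add, so the pairing below is a heuristic that should be reviewed before any reports are rewritten):
def detect_changes(old_schema, new_schema):
    # Heuristic diff of two Get Tables snapshots. Returns
    # {"renamed_tables": {old: new}, "renamed_columns": {table: {old: new}}}.
    # Names are paired only when exactly one disappeared and one appeared.
    old_tables = {t["name"]: t for t in old_schema.get("value", [])}
    new_tables = {t["name"]: t for t in new_schema.get("value", [])}
    changes = {"renamed_tables": {}, "renamed_columns": {}}
    gone = sorted(set(old_tables) - set(new_tables))
    came = sorted(set(new_tables) - set(old_tables))
    if len(gone) == 1 and len(came) == 1:
        changes["renamed_tables"][gone[0]] = came[0]
    for old_name, old_t in old_tables.items():
        # Follow the table under its new name if it was renamed.
        name = changes["renamed_tables"].get(old_name, old_name)
        if name not in new_tables:
            continue
        # Assumes each table entry carries a "columns" list; if the endpoint
        # returns names only, pull column metadata via the DMV/DAX route instead.
        old_cols = {c["name"] for c in old_t.get("columns", [])}
        new_cols = {c["name"] for c in new_tables[name].get("columns", [])}
        gone_c = sorted(old_cols - new_cols)
        came_c = sorted(new_cols - old_cols)
        if len(gone_c) == 1 and len(came_c) == 1:
            changes["renamed_columns"][name] = {gone_c[0]: came_c[0]}
    return changes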
The schema is compared with a stored snapshot (see the detect_changes sketch above) to find renamed tables and columns. When changes are detected, the script exports each affected report, rewrites the references inside it, and re-imports it. One caveat the endpoints impose: the export is a binary .pbix archive, and the report layout JSON lives inside it at Report/Layout, so the sketch below edits that entry rather than a plain JSON response. Treat it as a starting point and test on copies of reports first:
import io
import zipfile

def export_report(report_id):
    headers = {"Authorization": f"Bearer {get_access_token()}"}
    response = requests.get(f"{API_BASE}/reports/{report_id}/Export", headers=headers)
    response.raise_for_status()
    return response.content  # binary .pbix archive, not JSON

def _replace_ref(layout, key, old, new):
    # Visual configs are nested as escaped JSON strings, so replace both forms.
    for q in ('"', '\\"'):
        layout = layout.replace(f'{q}{key}{q}:{q}{old}{q}', f'{q}{key}{q}:{q}{new}{q}')
    return layout

def update_pbix_layout(pbix_bytes, changes):
    # The report layout is a UTF-16-LE JSON document stored at Report/Layout;
    # rewriting "Entity" (table) / "Property" (column) references is safer
    # than replacing bare names, which can also occur in unrelated text.
    with zipfile.ZipFile(io.BytesIO(pbix_bytes)) as zin:
        layout = zin.read("Report/Layout").decode("utf-16-le")
        for old_table, new_table in changes["renamed_tables"].items():
            layout = _replace_ref(layout, "Entity", old_table, new_table)
        for col_map in changes["renamed_columns"].values():
            for old_col, new_col in col_map.items():
                layout = _replace_ref(layout, "Property", old_col, new_col)
        out = io.BytesIO()
        with zipfile.ZipFile(out, "w", zipfile.ZIP_DEFLATED) as zout:
            for item in zin.infolist():
                if item.filename == "SecurityBindings":
                    continue  # a hand-edited .pbix is rejected if this part remains
                data = layout.encode("utf-16-le") if item.filename == "Report/Layout" else zin.read(item.filename)
                zout.writestr(item, data)
    return out.getvalue()

def import_report(pbix_bytes, report_name):
    headers = {"Authorization": f"Bearer {get_access_token()}"}
    url = f"{API_BASE}/imports?datasetDisplayName={report_name}&nameConflict=CreateOrOverwrite"
    response = requests.post(url, headers=headers, files={"file": ("report.pbix", pbix_bytes)})
    return response.status_code
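Putting the pieces together, a driver loop might look like the sketch below. schema_snapshot.json is just an illustrative local cache of the last-known schema, and GET /reports is the documented endpoint for listing a workspace's reports; test against copies before pointing this at customer content:
import os

SNAPSHOT_FILE = "schema_snapshot.json"  # illustrative local cache

def run():
    current = get_dataset_schema()
    if os.path.exists(SNAPSHOT_FILE):
        with open(SNAPSHOT_FILE) as f:
            previous = json.load(f)
        changes = detect_changes(previous, current)
        if changes["renamed_tables"] or changes["renamed_columns"]:
            headers = {"Authorization": f"Bearer {get_access_token()}"}
            reports = requests.get(f"{API_BASE}/reports", headers=headers).json()["value"]
            for report in reports:
                pbix = export_report(report["id"])
                updated = update_pbix_layout(pbix, changes)
                print(report["name"], import_report(updated, report["name"]))
    with open(SNAPSHOT_FILE, "w") as f:
        json.dump(current, f)

if __name__ == "__main__":
    run()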
By automating schema monitoring and report updates, you can minimize user impact when unavoidable table or column name changes occur. SQL views or calculated columns that preserve the old names can also help maintain stability. Let me know if you need further refinements or additional features such as email notifications or Power Automate integration.
Best regards,