Hi,
Recently we saw that MS has released Items - Refresh Sql Endpoint Metadata - REST API (SQLEndpoint) | Microsoft Learn, as well as its corresponding implementation in the semantic-link-labs library:
https://github.com/microsoft/semantic-link-labs/wiki/Code-Examples#refresh-sql-endpoint-metadata
We preferred the semantic-link-labs implementation because of its simplicity, but after managing all the problems related to the libraries (they have to be included in a Fabric environment in order to allow the notebook to be used in a pipeline), we noticed that the notebook execution fails intermittently with the following error message:
Notebook execution failed at Notebook service with http status code - '200', please check the Run logs on Notebook, additional details
- 'Error name - KeyError, Error value - "['Table Name', 'Status', 'Start Time', 'End Time', 'Last Successful Sync Time'] not in index"' :
The notebook Run logs do not give much detail.
Any idea about what is going wrong here?
Thanks,
Alfons
Hi,
Adding the time delay did not make any difference. It's quite clear that, for some reason, the call to the API that manages the refresh of the SQL endpoint fails, but I have no idea why.
I have tested scheduling the notebook to run at different times; sometimes it works fine, other times it does not:
Successful execution (screenshot)
Failed execution (screenshot)
but I am not able to understand what makes the difference between success and failure. Same lakehouse, same tables, ...
Alfons
Hi @alfBI ,
Thank you for reaching out to the Microsoft Community Forum.
You are seeing intermittent failures when using the refresh_sql_endpoint_metadata function from the semantic-link-labs library in Microsoft Fabric, in particular a KeyError related to missing DataFrame columns.
Please refer to the workarounds below.
1. Validate DataFrame Columns Before Access. Add a check before accessing the columns:

expected_cols = ['Table Name', 'Status', 'Start Time', 'End Time', 'Last Successful Sync Time']
if all(col in x.columns for col in expected_cols):
    display(x[expected_cols])
else:
    print("Expected columns not found. DataFrame is likely empty.")

Note: This prevents the notebook from failing when the DataFrame is empty.
2. Make sure at least one table exists in the Lakehouse before triggering the refresh. You can add a pre-check using the semantic-link-labs API to list tables and confirm presence.
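A minimal sketch of such a pre-check, assuming the notebook runs on Spark with the target lakehouse attached as its default (the get_lakehouse_tables helper in sempy_labs.lakehouse would be an alternative):

# 'spark' is predefined in a Fabric notebook session; listTables() returns
# the tables of the lakehouse attached as the notebook's default.
existing_tables = spark.catalog.listTables()
if existing_tables:
    x = labs.refresh_sql_endpoint_metadata(item=item, type=type, workspace=workspace, tables=None)
    display(x)
else:
    print("Lakehouse has no tables yet; skipping the SQL endpoint refresh.")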
3. Wrap the refresh logic in a try-except block:

try:
    x = labs.refresh_sql_endpoint_metadata(item=item, type=type, workspace=workspace, tables=tables)
    display(x)
except KeyError as e:
    print(f"KeyError encountered: {e}")

Note: This helps log errors and optionally retry or skip execution.
4. If you are using a multi-step ETL/ELT pipeline, consider forcing a sync of the T-SQL endpoint using semantic-link, for example by calling the refresh REST API directly from the notebook, as sketched below.
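A minimal sketch of such a forced sync via semantic-link's FabricRestClient, assuming the refreshMetadata path from the REST API page linked earlier in this thread (a preview API at the time of writing); the sql_endpoint_id placeholder must be replaced with the ID of the lakehouse's SQL analytics endpoint:

import sempy.fabric as fabric

client = fabric.FabricRestClient()

workspace_id = "a0ad263f-c689-480b-bcd2-cc1a5cc9169f"  # from the original post
sql_endpoint_id = "<sql-endpoint-id>"  # placeholder: look this up in the lakehouse settings

# POST to the Refresh Sql Endpoint Metadata API; a 200/202 response
# indicates the sync request was accepted.
response = client.post(
    f"/v1/workspaces/{workspace_id}/sqlEndpoints/{sql_endpoint_id}/refreshMetadata?preview=true",
    json={},
)
print(response.status_code)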
I hope this information helps. Please do let us know if you have any further queries.
Regards,
Dinesh
I forgot to add that, curiously, if I open a failed execution and rerun it from the failed refresh activity, it works. So it looks like, just after tables are ingested into the lakehouse, the API needs some time to notice that the lakehouse has tables. I will try again, adding a time activity (30 seconds) in front of the refresh.
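For reference, the in-notebook equivalent of that pipeline delay would be something like the following (the 30 seconds is the value I plan to test, not a documented requirement):

import time

# Give the SQL analytics endpoint time to notice the newly ingested tables
# before asking it to refresh its metadata.
time.sleep(30)

x = labs.refresh_sql_endpoint_metadata(item=item, type=type, workspace=workspace, tables=tables)
display(x)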
Hi @alfBI ,
Thank you for your response. As you mentioned, the notebook execution succeeds when rerun, and you want to test again by adding a time activity before the refresh. Once you are done with your testing, please do let us know if you have any further queries.
Regards,
Dinesh
Hi @alfBI ,
Thank you for reaching out to the Microsoft Community Forum.
The error message "KeyError, Error value - "['Table Name', 'Status', 'Start Time', 'End Time', 'Last Successful Sync Time'] not in index"" typically indicates that the notebook is trying to access columns in a DataFrame that don't exist, because the lakehouse is empty or the SQL endpoint metadata has not been initialized properly.
Please check the points below to fix the issue.
1. Before accessing columns in the notebook, check whether the DataFrame contains the expected columns. Please refer to the sample Python script below.

expected_cols = ['Table Name', 'Status', 'Start Time', 'End Time', 'Last Successful Sync Time']
if all(col in df.columns for col in expected_cols):
    df = df[expected_cols]
else:
    print("Expected columns not found. DataFrame is likely empty.")
2. Check that the Lakehouse has at least one table or object before triggering the SQL Endpoint refresh. An empty Lakehouse will cause the API to return an empty response.
3. Place the notebook execution in a try-except block and log errors to help with debugging. Please refer sample python code in try-except block.
try:
# notebook execution logic
except KeyError as e:
print(f"KeyError encountered: {e}")
# optionally skip or retry
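If you want to retry rather than just skip, a minimal retry loop might look like this (the attempt count and delay are arbitrary illustrative values):

import time
import sempy_labs as labs

max_attempts = 3
for attempt in range(1, max_attempts + 1):
    try:
        x = labs.refresh_sql_endpoint_metadata(item=item, type=type, workspace=workspace, tables=tables)
        display(x)
        break  # success, stop retrying
    except KeyError as e:
        print(f"Attempt {attempt} failed with KeyError: {e}")
        if attempt < max_attempts:
            time.sleep(30)  # give the endpoint time to pick up the new tables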
I hope this information helps. Please do let us know if you have any further queries.
Regards,
Dinesh
Hi v-dineshya,
Using semantic-link-labs, the notebook code is extremely simple:
#%pip install semantic-link-labs

# Welcome to your new notebook
# Type here in the cell editor to add code!
import sempy_labs as labs

item = 'Stage'  # Enter the name or ID of the Fabric item
type = 'Lakehouse'  # Enter the item type
workspace = 'a0ad263f-c689-480b-bcd2-cc1a5cc9169f'  # Enter the name or ID of the workspace

# Example 1: Refresh the metadata of all tables
tables = None
x = labs.refresh_sql_endpoint_metadata(item=item, type=type, workspace=workspace, tables=tables)
display(x)
Honestly, I have no idea how to apply your workaround here.
Thx