
sdmj
Frequent Visitor

Mirrored Databricks Catalog - Semantic Models Don't Receive Schema Updates

Hey All,

 

We have a data platform running in Databricks, and to read that data into Power BI we're experimenting with a Mirrored Databricks Catalog in Fabric. At first everything seemed to work fine: all changes in Databricks were visible in Fabric and in the reports fairly quickly.

 

But trouble started as soon as we moved away from the default semantic model. Schema changes aren't picked up automatically the way they are in the default semantic model. The only way I found to update the schema was to open the model in Power BI Desktop and refresh it there.
While that worked yesterday, today I'm getting an error message:
REQUEST_LIMIT_EXCEEDED","message":"Error in Databricks Table Credential API. Your request was rejected since your organization has exceeded the rate limit. Please retry your request later."

I'm using a very limited model with only 4 tables, so I can't really see where this is going wrong.

 

Can anyone help me identify why I'm getting this error and how I can avoid it? 

 

Thanks in advance!

 

1 ACCEPTED SOLUTION
sdmj
Frequent Visitor

So it seems like turning off the automatic sync helped.
We've not had issues since then. Following up in the Databricks audit log (query the table system.access.audit) also shows far fewer events for this user.
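For reference, a minimal sketch of how that audit-log check could be run from Python, assuming the databricks-sql-connector package and placeholder connection details (hostname, HTTP path, and token are not from the original post):

```python
# Sketch: count recent Unity Catalog events in the Databricks audit log.
# Assumes `pip install databricks-sql-connector` and a running SQL warehouse.
AUDIT_QUERY = """
SELECT event_time, user_identity.email, action_name
FROM system.access.audit
WHERE service_name = 'unityCatalog'
  AND event_time > now() - INTERVAL 1 DAY
ORDER BY event_time DESC
"""

def fetch_audit_events(server_hostname: str, http_path: str, access_token: str):
    # Import here so the module loads even without the connector installed.
    from databricks import sql
    with sql.connect(server_hostname=server_hostname,
                     http_path=http_path,
                     access_token=access_token) as conn:
        with conn.cursor() as cur:
            cur.execute(AUDIT_QUERY)
            return cur.fetchall()
```

Filtering on `action_name` (e.g. credential-related actions) can help pinpoint which user or sync process is generating the bulk of the calls.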

Manual refreshes of any model still seem to be very API-call intensive, so when working with multiple developers that's definitely something to keep in mind.

 

To turn off the automatic sync for the default model, go to the settings of the SQL analytics endpoint, then Default semantic model.


4 REPLIES

sdmj
Frequent Visitor

I've been digging around in the Databricks logs, and it looks like the API is completely saturated with requests since I turned on the automatic sync for the default model.

I have now turned this off and will give it some time to see if that solves the issue.

Curious if this resolved your issue. We have started receiving the same message over the last couple of days for a semantic model that had been working fine for weeks, with no recent changes whatsoever. Also, the default semantic models for Databricks mirroring catalogs and lakehouses are grayed out, so I'm not sure how to disable automatic sync for them. For the semantic model itself (the non-default one) we had the refresh disabled from the beginning, so I'm quite unsure what is going on here.
Please keep us posted if you make any progress!

nilendraFabric
Community Champion

Hi @sdmj 

 

Custom semantic models do not automatically receive schema updates from the Mirrored Databricks Catalog. Manual refreshes are required, and repeated or concurrent refreshes can trigger Databricks API rate limits, resulting in REQUEST_LIMIT_EXCEEDED errors. To avoid this, monitor and coordinate API usage, use batching and backoff strategies, and consider automating updates with Databricks Workflows.
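One way to coordinate refreshes, rather than each developer refreshing separately from Desktop, is to trigger them centrally through the Power BI REST API. A hedged sketch, where the workspace (group) ID, dataset ID, and bearer token are placeholders, not values from this thread:

```python
def refresh_url(group_id: str, dataset_id: str) -> str:
    # Power BI REST API endpoint for dataset (semantic model) refreshes.
    return (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
            f"/datasets/{dataset_id}/refreshes")

def trigger_refresh(token: str, group_id: str, dataset_id: str) -> int:
    # Returns 0 on success, or the number of seconds to wait when throttled.
    import requests  # imported here so the sketch loads without the package
    resp = requests.post(
        refresh_url(group_id, dataset_id),
        headers={"Authorization": f"Bearer {token}"},
        json={"notifyOption": "NoNotification"},
    )
    if resp.status_code == 429:  # throttled: honour Retry-After if present
        return int(resp.headers.get("Retry-After", "60"))
    resp.raise_for_status()
    return 0
```

Funnelling refreshes through one place like this makes it much easier to see, and cap, how many credential calls reach Databricks.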

 

 

• Wait before retrying. The rate limit usually resets after a short period (often a minute or so, but this can vary).
• Avoid repeatedly refreshing the semantic model in quick succession.

Even with a small model, certain operations—such as schema refreshes, credential validations, or repeated attempts to update the model—can collectively trigger rate limits, especially if multiple users or automated processes are accessing Databricks resources at the same time.
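The retry advice above can be sketched as a simple exponential-backoff wrapper. The error check is illustrative; adapt it to however your client surfaces the REQUEST_LIMIT_EXCEEDED message:

```python
import random
import time

def call_with_backoff(fn, max_retries: int = 5, base_delay: float = 1.0):
    """Retry fn with exponential backoff plus jitter when the call is
    rate limited. The RuntimeError/message check stands in for whatever
    exception your Databricks client actually raises."""
    for attempt in range(max_retries):
        try:
            return fn()
        except RuntimeError as err:
            if "REQUEST_LIMIT_EXCEEDED" not in str(err):
                raise  # not a rate-limit error: surface it immediately
            if attempt == max_retries - 1:
                raise  # out of retries
            # Double the wait each attempt, with jitter to avoid thundering herd.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 1))
```

With multiple developers, jitter matters: it spreads retries out so everyone doesn't hit the Table Credential API again at the same instant.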
