YashRaj5
Regular Visitor

Forbidden issue with Fabric SQL endpoint tables

I have multiple tables in a lakehouse, and since yesterday they have suddenly started showing this error:

(screenshot of the 403 Forbidden error)

A few tables show this error in the lakehouse, and they do not appear in the SQL endpoint at all. But when I read the same tables from the lakehouse with Spark in a notebook instead of through the SQL endpoint, I am able to read them successfully. I have Admin access to the workspace.

Things tried so far:

* I tried creating a shortcut in another lakehouse in a completely different workspace, pointing to the same location, but it still shows the same error.

* To check whether it is a data issue, I also copied the data to another lakehouse and then created a shortcut on top of it; that works fine.

It seems like some issue with the lakehouse itself, but if the lakehouse were the problem, the other tables should be failing as well, not just a few.

1 ACCEPTED SOLUTION
AmiGarala
Frequent Visitor

Hi @YashRaj5 

It looks like a SQL Endpoint sync / OneLake permissions issue, not a data issue. The tables still exist in the Lakehouse (since Spark can read them), but the SQL Analytics Endpoint is unable to index or access them, which results in the 403 Forbidden error and missing tables in the endpoint.

 

Recommended Fixes / Workarounds

  1. Validate OneLake folder permissions
    Make sure your identity has at least ReadAll on the specific table folders.
    The SQL endpoint uses these permissions even if you are a Workspace Admin (see the first sketch after this list for a quick access check).

  2. Trigger Lakehouse → SQL Endpoint metadata refresh

    • Refresh the SQL Analytics Endpoint in the Fabric UI

    • Or open the Lakehouse and run a small Spark notebook write, which often forces a sync (see the second sketch after this list)

  3. Check table folder structure
    Confirm the problematic tables are under:

     
    /Tables/<table_name>
     

    If they were manually written under /Files, the SQL endpoint will not pick them up; the first sketch after this list shows how to check this from a notebook.

  4. Recreate the table metadata
    As you tested, copying the data to another lakehouse or recreating the shortcut works, which points to corrupted metadata rather than bad data.
    You can also try recreating the table in the same lakehouse:

     
    # assumes `df` already holds the affected table's data, e.g. read via Spark
    df.write.format("delta").mode("overwrite").saveAsTable("table_name")
     
  5. If the issue persists, log a Fabric support ticket
    This issue matches an active Fabric bug where a few Delta tables fail to sync to SQL endpoints while others work fine.
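
A quick way to check steps 1 and 3 together is to list the Tables area from a Fabric notebook. This is only a sketch, assuming a default lakehouse is attached to the notebook; "table_name" is a placeholder for one of the affected tables:

     
    from notebookutils import mssparkutils  # bundled with Fabric notebooks

    # The affected tables should show up under Tables/, not under Files/
    for f in mssparkutils.fs.ls("Tables/"):
        print(f.name, f.path)

    # An access error on a specific table folder points at OneLake
    # permissions rather than at the SQL endpoint itself
    print(mssparkutils.fs.ls("Tables/table_name"))
     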
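For step 2, here is a minimal version of the "small Spark notebook write" nudge. Again just a sketch: "sync_probe" is a hypothetical throwaway table name, and spark is the session object Fabric notebooks provide:

     
    # Commit a tiny Delta write; fresh lakehouse activity often prompts the
    # SQL analytics endpoint to re-sync its metadata
    spark.createDataFrame([(1,)], ["id"]).write.format("delta").mode("overwrite").saveAsTable("sync_probe")

    # Drop the probe table once the endpoint has caught up
    spark.sql("DROP TABLE IF EXISTS sync_probe")
     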


3 REPLIES
v-priyankata
Community Support

Hi @YashRaj5 

Thank you for reaching out to the Microsoft Fabric Forum Community.

@AmiGarala Thanks for the inputs

I hope the information provided above was helpful. If you still have questions, please don't hesitate to reach out to the community.

 

YashRaj5
Regular Visitor

Thanks for the prompt response.

But all the steps you mentioned above had already been tried, and I am also an Admin on the workspace where I am facing the issue, so I am not sure the permissions point applies in that case.

 

Eventually, though, we found an option called 'Data Access Mode' at the SQL endpoint level that was somehow causing the issue; after some trial and error it started working. Thanks!
