DennesTorres
Impactful Individual

Schema on Lakehouse SQL endpoint

Hi,

A SQL endpoint in a lakehouse allows us to create schemas. The tables from the lakehouse are automatically placed in the dbo schema, but we can create new schemas.

Is there any way to include the tables in custom schemas, either using notebooks or the UI? If not, are we only able to include custom objects, such as views, in custom schemas?
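For context, a minimal sketch of what I can do today on the SQL endpoint (schema and table names are just examples):

CREATE SCHEMA [conformed];
GO

-- The lakehouse table itself stays in dbo; only custom objects
-- such as views appear to accept the new schema:
CREATE VIEW [conformed].[v_sales] AS
SELECT * FROM [dbo].[sales_internal];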

 

Kind Regards,

 

Dennes

1 ACCEPTED SOLUTION
puneetvijwani
Resolver IV

@DennesTorres  It seems to be the same limitation we used to have in Synapse Analytics lake databases.
Reference Text:

Custom SQL objects in lake databases

Lake databases allow creation of custom T-SQL objects, such as schemas, procedures, views, and the inline table-value functions (iTVFs). In order to create custom SQL objects, you MUST create a schema where you will place the objects. Custom SQL objects cannot be placed in dbo schema because it is reserved for the lake tables that are defined in Spark, database designer, or Dataverse.

 

Link: https://learn.microsoft.com/en-us/azure/synapse-analytics/metadata/database
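Assuming the same behavior carries over to the Fabric SQL endpoint, the pattern from the docs would look roughly like this (schema and object names are illustrative):

-- Custom T-SQL objects go into a user-created schema:
CREATE SCHEMA [reporting];
GO

CREATE VIEW [reporting].[v_sales_summary] AS
SELECT COUNT(*) AS row_count FROM [dbo].[sales_internal];
GO

CREATE PROCEDURE [reporting].[usp_sales_rowcount] AS
SELECT COUNT(*) AS row_count FROM [dbo].[sales_internal];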


6 REPLIES
GeethaT-MSFT
Microsoft Employee

Hi @DennesTorres, Spark doesn't support schemas, and in a Lakehouse, tables in the SQL endpoint are synchronized from the Spark catalog. So currently, all lakehouse tables will appear in the dbo schema.
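If it helps to verify, a standard metadata query on the SQL endpoint lists each synchronized table and its schema:

-- Synchronized lakehouse tables should all report dbo here:
SELECT TABLE_SCHEMA, TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE'
ORDER BY TABLE_SCHEMA, TABLE_NAME;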

 

Regards

Geetha

 

Schemas are available at the SQL endpoint. I transferred a table from dbo to a custom schema, which sounds like an extended property. Is there a way (or will there be) to define the schema while creating the table in Fabric? That would be beneficial for permissions granted via SQL at the schema level instead of on each table.
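For example, a single schema-level grant would then cover every table transferred into that schema (role name is just an example):

-- One grant instead of one per table:
GRANT SELECT ON SCHEMA::[conformed] TO [report_readers];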

Hi,

 

Could you provide more details about how you managed to transfer a table to a different schema, and whether you can still access this table from a notebook?

 

Kind Regards,

 

Dennes

You can move a table to another schema while using the SQL endpoint and querying via T-SQL (I did it in SSMS). This looks to be nothing more than an extended property on the underlying Delta table.

 

However, from the notebook you are still querying using Spark, which doesn't support schemas, as mentioned by the support team.

 

-- Move the table from dbo into the custom schema:
ALTER SCHEMA conformed TRANSFER [dbo].[sales_internal];

-- Confirm the table is reachable under the new schema:
SELECT COUNT(*) FROM [conformed].[sales_internal];


Anonymous
Not applicable

@puneetvijwani We can create custom SQL objects in the dbo schema as well, like views and procedures. It's only functions that we can't create.
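For example, both of these work for me directly in dbo (object names are illustrative), while the equivalent CREATE FUNCTION fails:

CREATE VIEW [dbo].[v_sales_check] AS
SELECT COUNT(*) AS row_count FROM [dbo].[sales_internal];
GO

CREATE PROCEDURE [dbo].[usp_sales_check] AS
SELECT COUNT(*) AS row_count FROM [dbo].[sales_internal];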
