Hi,
A SQL endpoint in a lakehouse allows us to create schemas. Tables from the lakehouse are automatically placed in the dbo schema, but we can create new schemas.
Is there any way to include the tables in custom schemas, either using notebooks or the UI? If not, can we only place custom objects, such as views, in custom schemas?
Kind Regards,
Dennes
@DennesTorres It seems to be the same limitation we used to have in Synapse Analytics lake databases.
Reference Text:
Lake databases allow creation of custom T-SQL objects, such as schemas, procedures, views, and inline table-valued functions (iTVFs). In order to create custom SQL objects, you MUST create a schema where you will place the objects. Custom SQL objects cannot be placed in the dbo schema because it is reserved for the lake tables that are defined in Spark, database designer, or Dataverse.
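To illustrate the rule quoted above, a minimal T-SQL sketch of creating a custom schema and placing a view in it could look like this (the "reporting" schema, "v_sales" view, and "sales_internal" table names are hypothetical examples, not objects from this thread):

```sql
-- Custom SQL objects cannot live in dbo, so create a custom schema first.
CREATE SCHEMA reporting;
GO

-- Then place the custom object (here a view over a lake table) in that schema.
-- "sales_internal" is a hypothetical lakehouse table synced into dbo.
CREATE VIEW reporting.v_sales
AS
SELECT *
FROM dbo.sales_internal;
```

The lake tables themselves stay in dbo; only the custom objects built on top of them go into the custom schema.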
Link :
https://learn.microsoft.com/en-us/azure/synapse-analytics/metadata/database
Hi @DennesTorres, Spark doesn't support schemas, and in a Lakehouse, tables in the SQL endpoint are synchronized from the Spark catalog. So currently, all lakehouse tables will appear in the dbo schema.
Regards
Geetha
Schemas are available at the SQL endpoint. I transferred a table from dbo to a custom schema, which looks like it is implemented as an extended property. Is there a way (or will there be one) to define the schema while creating the table in Fabric? That would be beneficial for granting permissions via SQL at the schema level instead of on each table.
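The schema-level permission scenario mentioned here could, as a sketch, look like the following in T-SQL (the "conformed" schema matches the example later in this thread; the "report_readers" role name is a hypothetical example):

```sql
-- Grant read access on every object in the schema at once,
-- instead of granting SELECT on each table individually.
GRANT SELECT ON SCHEMA::conformed TO [report_readers];
```

A single grant at the schema scope then covers any table later transferred into that schema, which is the maintenance benefit being described.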
Hi,
Could you provide more details about how you managed to transfer a table to a different schema, and whether you could still access the table from a notebook?
Kind Regards,
Dennes
You can move a table between schemas while using the SQL endpoint and querying via T-SQL (I did it in SSMS). This looks to be nothing more than an extended property on the underlying Delta table.
However, from the notebook you are still querying through Spark, which doesn't support schemas, as mentioned by the support team.
ALTER SCHEMA conformed TRANSFER [dbo].[sales_internal];
SELECT COUNT(*) FROM [conformed].[sales_internal];
@puneetvijwani We can create custom SQL objects in the dbo schema as well, such as views and procedures. It's only functions that we can't create there.