Hi Team,
I have a requirement where my delta table resides in the bronze layer of a Lakehouse, and the silver layer is in the warehouse. I have created a stored procedure in the warehouse to perform an SCD Type 2 operation (with the source in the bronze layer and the target in the warehouse). As part of this process, I am attempting to create a temporary table at runtime (e.g., SELECT * INTO tempTable FROM table1) in the Lakehouse. However, I am encountering the following error: The external policy action 'Microsoft.Sql/Sqlservers/Databases/Schemas/Tables/Create' was denied on the requested resource.
I am unsure whether this is a limitation within Microsoft Fabric or a permission issue, though I don't believe it's the latter, since I am an admin of the Fabric workspace.
Solved! Go to Solution.
Hello @Jaganath
@suparnababu8 is absolutely right
You can’t create or alter Lakehouse tables through the Lakehouse SQL endpoint; it’s read-only for T-SQL DDL statements. Even if you’re a workspace admin, table creation there is blocked. Instead, you could:
1. Use a Warehouse
If your goal is to handle staging or temp tables with T-SQL statements, do it in a Warehouse, where T-SQL table-creation operations (including temp tables) are allowed.
2. Use Spark Notebooks
In a Lakehouse, DDL changes require Spark. For instance:
# Create a table in Lakehouse from a notebook
df.write.format("delta").mode("overwrite").saveAsTable("myLakehouseTable")
If you need a temporary table specifically to handle SCD logic, move that logic to a Warehouse or keep it in Spark, so that creating the intermediate table is permitted.
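For reference, the core of the SCD Type 2 pattern mentioned above (close out the old version of a changed row, then insert a fresh "current" version) can be sketched in plain Python, independent of whether it ultimately runs in a Warehouse MERGE or a Spark job. The column names (`is_current`, `start_date`, `end_date`) and the `apply_scd2` helper are illustrative, not from your actual schema:

```python
from datetime import date

def apply_scd2(target, source, key, tracked, today=None):
    """Return a new SCD Type 2 target: rows whose tracked attributes
    changed are closed out and a fresh 'current' version is appended."""
    today = today or date.today().isoformat()
    source_by_key = {row[key]: row for row in source}
    result, seen = [], set()
    for row in target:
        src = source_by_key.get(row[key])
        if row["is_current"] and src and any(row[c] != src[c] for c in tracked):
            # Close the existing current version of this key.
            result.append(dict(row, is_current=False, end_date=today))
            # Open a new current version taken from the source row.
            result.append(dict(src, is_current=True,
                               start_date=today, end_date=None))
        else:
            result.append(row)
        seen.add(row[key])
    # Keys that never existed in the target become fresh current rows.
    for k, src in source_by_key.items():
        if k not in seen:
            result.append(dict(src, is_current=True,
                               start_date=today, end_date=None))
    return result
```

In a Warehouse you would express the same close-then-insert step with T-SQL (e.g. an UPDATE plus INSERT, or a MERGE), with the bronze Lakehouse table as the source via cross-database querying.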
hope this helps
thanks
Hi @Jaganath ,
Has the answer provided by @nilendraFabric or @suparnababu8 resolved your issue? If so, kindly mark the helpful reply as the accepted solution. This will help other community members with similar concerns find solutions more efficiently.
Thank you for your cooperation!
Hi @Jaganath
I too faced a similar problem in a Lakehouse. The Lakehouse SQL endpoint will not allow you to create new tables; it can only be used for querying the data.
Instead of using SQL, you can try creating a new table from a notebook using PySpark or Spark SQL. I tried it that way and it worked for me.
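If you prefer Spark SQL over the DataFrame API, the table creation can be done from a notebook cell. This is only a sketch: `spark` is the session Fabric notebooks pre-create for you, and the table names below are placeholders for your own bronze/silver tables:

```python
# Sketch only: table names are placeholders, not real objects.
ddl = (
    "CREATE TABLE IF NOT EXISTS silver_customers AS "
    "SELECT * FROM bronze_customers"
)
# In a Fabric notebook with a Lakehouse attached, run it with:
# spark.sql(ddl)
```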
You can go through the below blog; it'll help you solve your issue.
Let me know if it works.
Thanks