AyusmanBasu0604
Regular Visitor

Unable to call a Temp View in Fabric Notebook using Spark SQL

I am trying to reference a temp view that I have created, but I am unable to do so even when using the catalog schema global_temp.

statement:

%%sql
CREATE OR REPLACE TEMP VIEW temp_holidays AS
SELECT * FROM dbo.holidays LIMIT 10;

When I try to query the view within the same session, or even within the same notebook cell:
SELECT * FROM temp_holidays

I keep getting various errors like: 
1. java.io.IOException: Invalid input length 3
2. java.io.IOException: Unrecognized character: z

I have tried something like:
from pyspark.sql import SparkSession
spark.sql("create or replace temporary view temp_view_1 as select 1 as colA")
df = spark.sql(f"select * from temp_view_1")
df.show()

This gives me the output, but the moment I create a view from a valid lakehouse table and try to reference it as below:
from pyspark.sql import SparkSession
spark.sql("create or replace temporary view temp_grades as select * from Bronze.tbl_grades")
df = spark.sql(f"select * from temp_grades")
df.show()
I get the error: IllegalArgumentException: java.io.IOException: Unrecognized character: z

Is it just because Spark views are not supported in Lakehouse schemas, as mentioned here? https://learn.microsoft.com/en-us/fabric/data-engineering/lakehouse-schemas#public-preview-limitatio...

Are there any workarounds for this?
Purpose: I need to create a temp view/table on the fly so that I can store the old table/view data in the temp one, perform transformations, and then re-create a table/view with the original name. I can't simply use a DataFrame here because my actual view is a MATERIALIZED LAKE VIEW, and it doesn't support creation from a DataFrame.
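
For reference, the flow I'm after looks roughly like this (placeholder names from my examples above; step 2 is where the errors above occur, and step 3 is why a DataFrame alone isn't enough):

# Rough sketch of the intended flow (placeholder names)
# 1) Snapshot the current data into something temporary
spark.sql("create or replace temp view temp_grades as select * from Bronze.tbl_grades")

# 2) Read the snapshot back and apply transformations
df = spark.sql("select * from temp_grades")
# ... transformations on df ...

# 3) Re-create the object under its original name -- in my case a
#    MATERIALIZED LAKE VIEW, which cannot be created from a DataFrame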




5 REPLIES
v-echaithra
Community Support

Hi @AyusmanBasu0604 ,

May I ask if you have resolved this issue? Please let us know if you have any further issues, we are happy to help.

Thank you.

v-echaithra
Community Support

Hi @AyusmanBasu0604 ,

I hope the information provided is helpful. I wanted to check whether you were able to resolve the issue with the provided solutions. Please let us know if you need any further assistance.

Thank you.

HarishKM
Memorable Member

@AyusmanBasu0604 Hey,
The error is related to limitations on Spark views in Lakehouse environments.

I would follow the steps below to verify:

1)  Ensure that the schema and table formats are compatible with Spark.

2) Sometimes, issues arise from unsupported data types or formats in the Lakehouse environment.


3) Instead of creating a temporary view, you might consider using a global temporary view, which persists across all sessions until the Spark application terminates:
Try this code - 

from pyspark.sql import SparkSession

spark.sql("CREATE OR REPLACE GLOBAL TEMP VIEW temp_grades AS SELECT * FROM Bronze.tbl_grades")
df = spark.sql("SELECT * FROM global_temp.temp_grades")
df.show()

 

4) If you can't use views, consider caching the DataFrame. This approach won't directly replace a materialized view, but it can improve performance:

df = spark.sql("SELECT * FROM Bronze.tbl_grades")
df.cache()
# Perform transformations
df.show()

 

5) If the only reason to avoid DataFrames is the materialized view, consider saving the transformed DataFrame to a temporary location and then recreating the view/table. While this approach uses more I/O, it might work:

 

df = spark.sql("SELECT * FROM Bronze.tbl_grades")
# Perform transformations
df.write.format("delta").mode("overwrite").save("/path/to/temp/location")
# Load back from the temp location into a new table/view
spark.sql("CREATE OR REPLACE VIEW new_table AS SELECT * FROM delta.`/path/to/temp/location`")

 

 

Thanks

Harish M

 

First of all, thanks for your detailed suggestions @HarishKM.
Point #3 - With global temp views the issue is the same; I had tried this multiple times earlier.

Point #5 - I tried saving the DataFrame into a temp location. That worked, but when querying it I got the error: Py4JJavaError: An error occurred while calling o348.sql.
: java.lang.AssertionError: assertion failed: Only the following table types are supported: MANAGED, MATERIALIZED_LAKE_VIEW

I also tried using a materialized lake view, and it comes back to the original error I was getting while trying to reference a global/temp view directly in a Materialized Lake View: [TABLE_OR_VIEW_NOT_FOUND]

Hi @AyusmanBasu0604,

Thank you for your response. You're encountering known limitations of Microsoft Fabric's Lakehouse when working with temporary views and Materialized Lake Views (MLVs) using Spark SQL.
Reference: Lakehouse schemas (Preview) - Microsoft Fabric | Microsoft Learn

Fabric Lakehouse does not support reading from temporary or global temporary views in certain operations, especially when working with Materialized Lake Views, which have stricter limitations: referencing views created with Spark SQL in another SQL statement or notebook cell fails, as does using a TEMP VIEW or GLOBAL TEMP VIEW in MLV creation or inside CREATE OR REPLACE VIEW.

You can try this workaround: use a staging Delta table. Instead of using a TEMP VIEW, create a real physical Delta table (possibly under a temporary/staging schema) and refer to that. Then use it in your MLV. This works because you're using a physical Delta table, which Spark and Fabric support referencing in MLVs. A rough sketch follows below.
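
As a minimal sketch of that approach (assuming a staging schema already exists in the Lakehouse; the table and MLV names here are illustrative, and the MLV DDL should be adjusted to match the materialized lake view documentation for your environment):

# Sketch: stage the data in a real managed Delta table instead of a temp view
df = spark.sql("SELECT * FROM Bronze.tbl_grades")
# ... perform transformations on df ...

# saveAsTable writes a managed Delta table, which an MLV can reference
df.write.format("delta").mode("overwrite").saveAsTable("staging.tbl_grades_staging")

# Build the materialized lake view on top of the physical staging table
spark.sql("""
CREATE MATERIALIZED LAKE VIEW IF NOT EXISTS Bronze.mlv_grades
AS SELECT * FROM staging.tbl_grades_staging
""")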

Hope this helps.
Best Regards
Chaithra E.
