eddy1980
New Member

Artifact not found when querying a table in a lakehouse using notebook in Fabric

I am using the new schema-enabled lakehouse in Fabric. When I run the code below to query a table, I get the error message below. I can open the table in the lakehouse, but I cannot query it from a notebook.
Code in notebook:
df = spark.sql("SELECT * FROM Bronze_Lakehouse.dbo.Question ")
display(df)
 
Error:

Py4JJavaError: An error occurred while calling o324.sql. : com.microsoft.fabric.spark.metadata.DoesNotExistException: Artifact not found: workspaceABC.bronze_lakehouse

1 ACCEPTED SOLUTION
Anonymous
Not applicable

Hi @eddy1980,

In fact, I can run this code in both lakehouses, with and without schemas enabled. (I modified the code to remove the workspace and lakehouse prefix, set the default lakehouse, and query a table that exists in the current default lakehouse.)

 

df = spark.sql("SELECT * FROM Question")
display(df)

 

 

Snapshots: [screenshots of the query running successfully in both lakehouses]

BTW, was the lakehouse SQL endpoint generated successfully, and did it sync the tables from the lakehouse you used? If not, I'd suggest reporting this to the dev team so they can check the root cause and fix it more quickly.
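As a quick sanity check, something like this is a rough sketch that shows what the notebook can actually resolve. It assumes a default lakehouse is attached to the notebook and uses the built-in spark session; it only inspects the Spark catalog, not the SQL endpoint itself:

# Rough sketch: show what the Spark catalog can see from this notebook.
# Assumes a default lakehouse is attached and the built-in `spark` session.
print("Current database:", spark.catalog.currentDatabase())

# If the table is missing from this list, the problem is on the
# Spark/metadata side rather than in the SELECT statement itself.
for t in spark.catalog.listTables():
    print(t.name, "-", t.tableType)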

Regards,

Xiaoxin Sheng


8 REPLIES
mscollis
New Member

Hi @eddy1980. I know this is almost a year old; however, I just ran into this exact same error this week and found a solution. My sandbox environment ran fine using the schema-enabled lakehouse, but when I ran the exact same code in dev against the schema-enabled lakehouse, it failed. Same naming conventions and everything across objects. I got the same "Artifact not found" error, shown here. The same code attached to a regular lakehouse (without schemas) worked just fine.

[Screenshot: the same "Artifact not found" error in the dev environment]

Digging in a little more, I ran a script in the dev environment to check whether anything looked off, and I saw that the default workspace had a leading space in its name.

[Screenshot: script output showing the default workspace name with a leading space]

As it turns out, somebody fat-fingered the space in there when creating the workspace. There is a bug (or feature!) in schema-enabled lakehouses: they won't work with special characters, including spaces. This is actually mentioned near the top and again at the bottom of the documentation for lakehouse schemas, which is somehow still in public preview after a year.
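For anyone who wants to run a similar check, here is a rough sketch (not the exact script from my screenshot): it lists the databases/schemas visible to the Spark catalog and flags any name containing whitespace or other characters outside letters, digits, and underscores. It assumes the notebook's built-in spark session:

import re

# Rough sketch: list the databases/schemas the Spark catalog can see and
# flag any name containing characters outside A-Za-z0-9_.
# Note: if this call itself fails with "Artifact not found", that points
# at the same metadata issue.
suspicious = re.compile(r"[^A-Za-z0-9_]")

for db in spark.catalog.listDatabases():
    marker = "<-- check this name" if suspicious.search(db.name) else ""
    # repr() makes leading/trailing spaces visible in the output
    print(repr(db.name), marker)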

 

I verified this by looking at the name side-by-side with the sandbox environment. Sure enough, there was the space shifting it over slightly. I can't believe I didn't notice it earlier.

[Screenshot: side-by-side comparison showing the leading space in the dev workspace name]

I had our admin change the workspace name to remove the space, and bingo - the code ran fine afterwards.

 

I hope your problem has long since been solved, but in case you've been up at night randomly thinking about this problem from a year ago, maybe this will help!

 

 

eddy1980
New Member

Thanks @Anonymous . I ran your script and it didn't work. This issue only happened at our prod workspace. It works fine in uat and dev workspace. 

 

In addition, when I run the script below in the notebook, I get the same error. Please see the screenshot below.

spark.sql("SHOW SCHEMAS").show(truncate=False)

 

[Screenshot: the same "Artifact not found" error from the notebook]

 

I guess it is caused by a setting on the prod workspace. However, when I compared the settings between UAT and prod, they are the same. Could you please advise further?

Anonymous
Not applicable

Hi @eddy1980,

Any update on this? Did the above help? If not, feel free to post here.

Regards,

Xiaoxin Sheng

Hi @Anonymous, thanks for following up on this. I couldn't resolve it; I had to create a lakehouse without the schema feature.


Anonymous
Not applicable

Hi @eddy1980,

Currently, it does not seem to support switching directly between a common lakehouse and a schema-enabled lakehouse.
If you are working with common lakehouse tables, you can use the pinned lakehouse table name directly, without any prefix.

df = spark.sql("SELECT * FROM Question ")
display(df)

In addition, I would not recommend adding any special characters to lakehouse, schema, or table names, as they may affect queries.

Regards,

Xiaoxin Sheng

eddy1980
New Member

Thanks @Anonymous. I tried your script and it doesn't work. This issue only happens in our prod workspace in Fabric; it works fine in the UAT workspace.

 

In addition, if I run the script below to show the schemas of the lakehouse, I get the same "Artifact not found" error. Please see the screenshot below.

spark.sql("SHOW SCHEMAS").show(truncate=False)

 

I guess it could be caused by a workspace or lakehouse setting. I compared the workspace settings between prod and UAT; they are the same. Is there a command I can run in the notebook to enable schemas on a lakehouse?

 

[Screenshot: "Artifact not found" error when running SHOW SCHEMAS in the prod workspace]

 

Anonymous
Not applicable

Hi @eddy1980,

I tested with your script and it works well on my side. Have you set the default lakehouse of the notebook so it can quickly reference this resource? In addition, you can try using just the table name in the query string if the table is hosted in the default dbo schema.
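Something like the following is what I mean. It is only a sketch using the names from this thread (Bronze_Lakehouse and Question), and it assumes Bronze_Lakehouse is attached as the notebook's default lakehouse with the table in its default dbo schema:

# Sketch using the names from this thread; adjust them to your own setup.
# Assumes Bronze_Lakehouse is the notebook's default lakehouse and the
# Question table lives in its default dbo schema.

# Reference the table directly, as suggested above:
df = spark.sql("SELECT * FROM Question")
display(df)

# The fully qualified form from the original question, for comparison:
# df = spark.sql("SELECT * FROM Bronze_Lakehouse.dbo.Question")
# display(df)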

[Screenshot: querying with just the table name against the default lakehouse]

Regards,

Xiaoxin Sheng
