rafiqahmed
Frequent Visitor

Read and write to different lakehouses using notebook in same workspace

Hello,

My goal is to read data from a table in one lakehouse, clean it, and save it to another lakehouse in the same workspace.

 

 

I am getting a "table not found" error:

df = spark.sql("SELECT * FROM GoldLakehouse.SalesInvoiceHeaders LIMIT 1000")
display(df)
(screenshot rafiqahmed_1-1744029063753.png: the "table not found" error)

1 ACCEPTED SOLUTION
rafiqahmed
Frequent Visitor

The problem is solved for me.

 

I had created the second lakehouse with the Lakehouse schemas checkbox checked.

(screenshot rafiqahmed_1-1744033934467.png: the Lakehouse schemas checkbox in the new-lakehouse dialog)

That was causing the issue. I deleted the lakehouse and recreated it with the Lakehouse schemas checkbox unchecked. After reopening the notebook, I can now read from a table in one lakehouse and write to a table in the other lakehouse in the same workspace.

 

Thanks

 

View solution in original post

2 REPLIES

Anonymous
Not applicable

Hi @rafiqahmed,

 

Thank you for sharing your update and confirming that the issue is resolved. Please accept your own post as the solution; this will help other community members who face a similar issue.

 

Regards,

Vinay Pabbu
