I want to read data from two lakehouses and write it to another lakehouse, but during a notebook run I can only set one default lakehouse. Is there any possible way to do this?
Hi @gunjankabra ,
Thanks for using Fabric Community. Yes, it is possible to read data from the default lakehouse and write it to another lakehouse.

Reading data from the default lakehouse:

df = spark.sql("SELECT * FROM gopi_lake_house.customer_table1 LIMIT 1000")
display(df)

Writing data to a different lakehouse:

df.write.format("delta").saveAsTable("gopi_test_lakehouse.sales_external")
Hope this is helpful. Please let me know in case of further queries.
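As an alternative that does not depend on the default lakehouse at all, a table can be read by its full OneLake ABFSS path. This is a minimal sketch: the workspace name ("my_workspace") is a placeholder, and the lakehouse/table names are only illustrative.

```python
def onelake_table_path(workspace: str, lakehouse: str, table: str) -> str:
    """Build the ABFSS URI for a Delta table stored in a Fabric lakehouse.

    Assumes the standard OneLake layout: <lakehouse>.Lakehouse/Tables/<table>.
    """
    return (
        f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
        f"{lakehouse}.Lakehouse/Tables/{table}"
    )


def read_without_default():
    # Imported lazily so the path helper above works even without Spark installed.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()
    # "my_workspace" is a placeholder; substitute your own workspace name.
    path = onelake_table_path("my_workspace", "gopi_lake_house", "customer_table1")
    return spark.read.format("delta").load(path)
```

Because the path identifies the workspace and lakehouse explicitly, this works even when the table's lakehouse is not attached as the notebook default.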
I want to read data from two different lakehouses in a single notebook and write to another.
Hi @gunjankabra ,
Please correct me if my understanding is wrong.
As per my understanding, you can read tables from different lakehouses in a single notebook and save the result as a table in yet another lakehouse.
Note: The default lakehouse here is gopi_bronze.
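The approach above can be sketched as follows. This assumes all three lakehouses (gopi_bronze as the default, plus gopi_lake_house and gopi_test_lakehouse) are attached to the notebook and live in the same workspace; the table names and the "customer_id" join key are illustrative. Tables outside the default lakehouse are referenced with the two-part <lakehouse>.<table> name.

```python
def qualified(lakehouse: str, table: str) -> str:
    """Build the two-part Spark SQL name for a table in a given lakehouse."""
    return f"{lakehouse}.{table}"


def combine_and_save():
    # Imported lazily so the helper above works even without Spark installed.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Read from two different lakehouses in the same notebook.
    customers = spark.read.table(qualified("gopi_lake_house", "customer_table1"))
    sales = spark.read.table(qualified("gopi_bronze", "sales_table"))

    # "customer_id" is a hypothetical join key; adjust to your schema.
    joined = customers.join(sales, on="customer_id", how="inner")

    # Write the result to a third lakehouse.
    joined.write.format("delta").mode("overwrite").saveAsTable(
        qualified("gopi_test_lakehouse", "sales_external")
    )
```

Only bare table names resolve against the default lakehouse; the two-part names make the other lakehouses addressable without changing the default.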
Hope this is helpful. Please let me know in case of further queries.
Hello @gunjankabra ,
We haven't heard from you since the last response and wanted to check whether you have found a resolution yet.
If not, please reply with more details and we will try to help.
Hi @gunjankabra ,
We haven't heard from you since the last response and wanted to check whether you have found a resolution yet.
If you have found a resolution, please share it with the community, as it can be helpful to others.
If not, please reply with more details and we will try to help.