DebbieE
Community Champion

Fabric Notebook with a default lakehouse (Silver) and another lakehouse mounted (Gold)

I am creating a dataframe with data from the silver lakehouse, so this is the default one: dev_silver_lk.

But I want to create a dimension Delta Parquet table in the gold lakehouse, dev_gold_lk.

I have also mounted this lakehouse to the notebook. (Both are in public preview, as I have Schemas ticked.)

And I just can't seem to do it.

Surely this should be possible?

I suspect it might be because my lakehouse has preview schemas. I tried this in a PySpark notebook and it didn't work:

 

from delta.tables import DeltaTable

gold_lh = "dev_timesheet_gold_lh"
target_path = gold_lh + "/Tables/dimCustomer"

dfBlockersKey_reordered.write.format("delta").mode("overwrite").save(target_path)

But I just get: Py4JJavaError: An error occurred while calling o17332.save. : Operation failed: "Bad Request", 400, HEAD,
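For reference, a cross-lakehouse write generally needs the full OneLake (ABFSS) path rather than a bare lakehouse name, which is likely why the relative path above fails. A minimal sketch of building such a path; the workspace name "MyWorkspace" is a hypothetical placeholder, and the Spark write itself is left commented out since it needs a live session:

```python
# Sketch only: cross-lakehouse writes usually need the full OneLake (ABFSS)
# path, because a bare lakehouse name is not resolvable from the notebook.
workspace = "MyWorkspace"            # hypothetical workspace name
gold_lh = "dev_timesheet_gold_lh"

target_path = (
    f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
    f"{gold_lh}.Lakehouse/Tables/dimCustomer"
)
print(target_path)

# With a schema-enabled (preview) lakehouse, the path may also need the
# schema folder, e.g. .../Tables/dbo/dimCustomer.
# dfBlockersKey_reordered.write.format("delta").mode("overwrite").save(target_path)
```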
1 REPLY
Liam_McCauley
Frequent Visitor

You could try saveAsTable:

 

dfBlockersKey_reordered.write \
    .format("delta") \
    .mode("overwrite") \
    .saveAsTable("dev_timesheet_gold_lh.dimCustomer")
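Since the gold lakehouse has schemas enabled (preview), the two-part name may still fail; a hedged sketch of the three-part form, assuming the default schema is dbo:

```python
# Sketch: with a schema-enabled lakehouse (preview), saveAsTable may need
# the three-part name lakehouse.schema.table. "dbo" is the default schema
# and is an assumption here.
lakehouse = "dev_timesheet_gold_lh"
schema = "dbo"
table = "dimCustomer"

qualified_name = f"{lakehouse}.{schema}.{table}"
print(qualified_name)

# The write itself (commented out, requires a Spark session):
# dfBlockersKey_reordered.write.format("delta").mode("overwrite").saveAsTable(qualified_name)
```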
