Poweraegg
Advocate IV

Fabric - Moving Data in Medallion Architecture between Workspaces

Hi,

 

I am trying to load and transform data from the Lakehouse in the Silver Layer workspace into a Warehouse in the Gold Layer workspace. However, the only tool that seems to work for this is a Gen2 Dataflow. I cannot use a pipeline Copy Data activity, and I cannot figure out how to use a SQL query in the Warehouse to load and transform the data via either the ABFSS Delta table path or the database name. I also tried writing the data into the Warehouse from a Lakehouse notebook, but I get a PUT operation error. Any idea how to load and transform the data from the Silver Layer Lakehouse workspace to the Gold Layer Warehouse workspace without using a dataflow? The reason I want to avoid dataflows: they only offer append or replace, so I cannot run a duplicate check before loading.

 

Thanks

4 REPLIES
Anonymous
Not applicable

I faced the same issue while running a merge query against the Warehouse table path from Silver to Gold in a notebook. Is there a solution for this? Kindly share one if you have it.

Poweraegg
Advocate IV

from delta.tables import DeltaTable
from pyspark.sql.functions import col
from pyspark.sql.types import DecimalType

# Define the source (Silver Lakehouse) and destination (Gold Warehouse) paths
silver_path = "abfss://Contoso_Silver@onelake.dfs.fabric.microsoft.com/Contoso_Silver_LH.Lakehouse/Files/Product_Silver_External"
gold_path = "abfss://Contoso_Gold@onelake.dfs.fabric.microsoft.com/WH_Contoso_Gold.Datawarehouse/Tables/dbo/Products_Gold"

# Read the Delta table
deltaTable = DeltaTable.forPath(spark, silver_path)

# Convert the Delta table to a DataFrame
df_silver = deltaTable.toDF()

# Cast the 'Unit_Cost' and 'Unit_Price' columns to match the types in the Gold Warehouse table
df_silver = df_silver.withColumn("Unit_Cost", col("Unit_Cost").cast(DecimalType(18,0)))
df_silver = df_silver.withColumn("Unit_Price", col("Unit_Price").cast(DecimalType(18,0)))

# Write the data to the Gold Warehouse
df_silver.write.format("delta").mode("overwrite").save(gold_path)
 
Output:

ERROR: Caused by: Operation failed: "Bad Request", 400, PUT, http://onelake.dfs.fabric.microsoft.com/Contoso_Gold/WH_Contoso_Gold.Datawarehouse/Tables/dbo/Produc... OperationNotAllowedOnThePath, "PUT call is not allowed on Tables path"

bcdobbs
Community Champion

I think the issue is that you can only write to Warehouse tables through the SQL endpoint. I'll have a play and get back to you.
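
Something along these lines is what I have in mind: run the duplicate-aware insert as T-SQL against the Warehouse's SQL endpoint from the notebook with pyodbc. This is an untested sketch: the server name, the Product_Key column, and Contoso_Gold_LH (a Lakehouse in the Gold workspace holding a shortcut to the Silver table) are all placeholders, not verified names.

import pyodbc

# SQL connection string as shown on the Warehouse settings page.
# The server name below is a placeholder; auth here is interactive Azure AD.
conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your-endpoint>.datawarehouse.fabric.microsoft.com;"
    "Database=WH_Contoso_Gold;"
    "Authentication=ActiveDirectoryInteractive;"
    "Encrypt=yes;"
)

# Insert only rows whose key is not already in the Gold table.
# Contoso_Gold_LH.dbo.Product_Silver_External is assumed to be a shortcut to
# the Silver table surfaced through the SQL endpoint; Product_Key is a
# placeholder business key.
dedup_insert = """
INSERT INTO dbo.Products_Gold (Product_Key, Unit_Cost, Unit_Price)
SELECT s.Product_Key, s.Unit_Cost, s.Unit_Price
FROM Contoso_Gold_LH.dbo.Product_Silver_External AS s
WHERE NOT EXISTS (
    SELECT 1 FROM dbo.Products_Gold AS g WHERE g.Product_Key = s.Product_Key
);
"""

conn = pyodbc.connect(conn_str)
conn.execute(dedup_insert)
conn.commit()
conn.close()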



Ben Dobbs

LinkedIn | Twitter | Blog

Did I answer your question? Mark my post as a solution! This will help others on the forum!
Appreciate your Kudos!!
bcdobbs
Community Champion

I'm struggling to find the reference now, but cross-workspace data copy in a pipeline is not yet supported (I'm sure it is on the roadmap).


A shortcut, I think, is an option; a rough sketch of what I mean is below.
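
For example (names are purely illustrative): after creating a shortcut under the Tables section of a Lakehouse in the Gold workspace that points at the Silver Delta table, a notebook attached to that Lakehouse should see it as an ordinary table and can dedupe before landing the data:

from pyspark.sql import SparkSession

# In a Fabric notebook "spark" already exists; getOrCreate() just reuses it.
spark = SparkSession.builder.getOrCreate()

# "Product_Silver_External" is the assumed shortcut name under Tables.
df_silver = spark.read.table("Product_Silver_External")

# Drop duplicates on a (hypothetical) business key before writing to Gold.
df_clean = df_silver.dropDuplicates(["Product_Key"])
df_clean.show(5)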

 

Can you share the notebook code that is throwing the error?



Ben Dobbs

LinkedIn | Twitter | Blog

Did I answer your question? Mark my post as a solution! This will help others on the forum!
Appreciate your Kudos!!
