
amty63
Helper III

Unable to load & create the table in the silver lakehouse

Hello Experts,
I am trying to read a table, rename a few columns through SQL, and then create and load the table in the silver lakehouse.
Below is the code I am using in a Fabric notebook.
I am fetching the data from the bronze layer (staging schema) and need to create and load a table with the same data but with renamed columns. Please suggest.

# Import lib
from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .appName("DataFrame Write") \
    .getOrCreate()

df = spark.sql("""
SELECT
    ORDER_ID,
    TOOLPURCHASE_ID,
    SOC_ID AS CORP_ID,
    STATUS_ID,
    CURRENCY_ID,
    COLSID_CREATION AS EMPID_CREATION,
    ETS_ID AS EST_ID,
    COLSID_REQUESTER AS EMPID_REQUESTER,
    ORDER_IDEXTERNAL,
    ORDER_DATE,
    ORDER_AMOUNT,
    ORDER_MODIFICATIONDATE AS ORDER_MODIFICATION_DATE,
    ORDER_DESCRIPTION AS ORDER_DESCR,
    ORDER_DELIVERYDATEASKED
FROM
    ODS_PURCHASE_ORDER
""")

df.show()

# Saving to the Silver Lakehouse
encoded_url = "https%3A%2F%2Fonelake.dfs.fabric.microsoft.com%2FWKS%20POC%20FABRIC%20ORBITALSHIFT%20UAT%2FLKH_PROCUREMENT_UAT_Silver.Lakehouse%2FTables%2FNotebook_DWH_PROCUREMENT_PURCHASE_ORDER%2F"
df.write.mode("overwrite").format("delta").save(encoded_url)


Error >>>
Py4JJavaError                             Traceback (most recent call last)
Cell In[41], line 32
     30 # Saving to the Silver Lakehouse
     31 encoded_url = "https%3A%2F%2Fonelake.dfs.fabric.microsoft.com%2FWKS%20POC%20FABRIC%20ORBITALSHIFT%20UAT%2FLKH_PROCUREMENT_UAT_Silver.Lakehouse%2FTables%2FNotebook_DWH_PROCUREMENT_PURCHASE_ORDER%2F"
---> 32 df.write.mode("overwrite").format("delta").save(encoded_url)
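For reference, the string passed to save() above is a percent-encoded HTTPS URL, which Spark does not interpret as a valid storage path. Decoded into a plain OneLake ABFS path, the same target would look roughly like the sketch below (the workspace, lakehouse, and table names are simply decoded from the URL in the question; whether a cross-lakehouse write then succeeds is exactly what the replies below discuss):

# Sketch only: the same target expressed as a non-encoded OneLake ABFS path.
# Workspace and lakehouse names are decoded from the URL in the question;
# a workspace name containing spaces may need to be replaced by the workspace ID.
abfs_path = (
    "abfss://WKS POC FABRIC ORBITALSHIFT UAT@onelake.dfs.fabric.microsoft.com/"
    "LKH_PROCUREMENT_UAT_Silver.Lakehouse/Tables/Notebook_DWH_PROCUREMENT_PURCHASE_ORDER"
)
df.write.mode("overwrite").format("delta").save(abfs_path)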
5 REPLIES
amty63
Helper III

@v-nikhilan-msft 
Thanks for the reply.
I know the workaround and could do this through Dataflow Gen2 and Data Pipelines, but the requirement is to achieve it through notebooks.
Is there any way to create the table in the silver lakehouse through code?

Hi @amty63 

Currently, achieving this through notebooks is not possible, because a notebook can have only one default lakehouse.
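To illustrate what the single default lakehouse means in practice, here is a minimal sketch, assuming the bronze lakehouse is attached as the notebook's default (the target table name is only an example): unqualified names in spark.sql() and saveAsTable() both resolve against that one default lakehouse.

# Sketch: with the bronze lakehouse as the notebook's default lakehouse,
# both the read and the write below resolve against that same lakehouse.
df = spark.sql("SELECT * FROM ODS_PURCHASE_ORDER")  # reads from the default lakehouse
df.write.mode("overwrite").format("delta").saveAsTable("PURCHASE_ORDER_SILVER")  # example name; lands in the default lakehouse's Tables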

Hope this helps.

Hi @amty63 
We haven't heard back from you on the last response and were just checking to see whether your query got resolved. Otherwise, please respond with more details and we will try to help.
Thanks


v-nikhilan-msft
Community Support

Hi @amty63 
Thanks for using Fabric Community.
Reading data from one lakehouse and creating a table with the same data in another lakehouse is not possible in notebooks. The reason is that there can be only one default lakehouse for a notebook in the current Spark session. Instead, you can use a dataflow to create a new table in another lakehouse. I have created a repro for you and attached the screenshots below:

1) I have a table named Covid_table in the Source Lakehouse. The schema is as below:

vnikhilanmsft_0-1710748063120.png


2) Create a Dataflow Gen2 and select the source as Lakehouse. Select the required lakehouse and the table.

vnikhilanmsft_1-1710748188246.png

 

vnikhilanmsft_2-1710748291026.png

 

vnikhilanmsft_3-1710748349157.png


After the data is successfully imported, you can make the required changes. You can remove columns:

vnikhilanmsft_4-1710748447070.png


You can rename the columns:

vnikhilanmsft_5-1710748483858.png


After making the changes, select the silver lakehouse as the destination.

vnikhilanmsft_6-1710748556823.png

 

vnikhilanmsft_7-1710748570649.png

Give the table a new name:

vnikhilanmsft_8-1710748617687.png

 

vnikhilanmsft_9-1710748631210.png


Save the settings and click Publish. A new table will be created in the lakehouse.

 

vnikhilanmsft_10-1710748791906.png
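Once the dataflow has published, the new table can be queried from a notebook that has the silver lakehouse attached as its default lakehouse; a minimal check might look like the sketch below (the table name is only a placeholder for the name set in the destination settings):

# Sketch: quick sanity check from a notebook whose default lakehouse is the silver lakehouse.
# Replace the table name with the one chosen in the dataflow destination settings.
spark.sql("SELECT COUNT(*) AS row_count FROM Notebook_DWH_PROCUREMENT_PURCHASE_ORDER").show()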


Hope this helps. Please let me know if you have any further questions.


