Arnab0111
Helper I

Save a PySpark dataframe to a warehouse table using a notebook

I am using a notebook and have a PySpark dataframe. Please guide me on saving it as a table, in overwrite mode, to a warehouse under a custom schema.

1 ACCEPTED SOLUTION
puneetvijwani
Resolver IV

@Arnab0111 I would suggest navigating to the Data Engineering experience from the bottom left and opening the Use sample section; there you will find the Data Engineering starter kit. It has many examples covering writing and transformation that show how to save a table into the Tables section of a Lakehouse, along with data prep from a data engineering point of view. Saving this way puts the table in the default catalog of the lakehouse.

After that, open the Data Warehouse and add your Lakehouse SQL endpoint.

epunvij_0-1693315706438.png

Query the dbo schema table from the lakehouse and save it as a view into the custom schema of your data warehouse.

epunvij_1-1693315828900.png
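The view step above can be sketched in T-SQL run against the warehouse (the schema name `reporting`, view name `my_table_v`, lakehouse name `MyLakehouse`, and column list are hypothetical):

```sql
-- Create the custom schema once, if it does not exist yet.
CREATE SCHEMA reporting;
GO

-- Expose the lakehouse dbo table as a view in the warehouse's custom schema.
-- Three-part naming reaches the attached lakehouse's SQL endpoint.
CREATE VIEW reporting.my_table_v AS
SELECT id, name
FROM MyLakehouse.dbo.my_table;
```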


3 REPLIES

puneetvijwani
Resolver IV

@Arnab0111 As of now, it is not possible to save a PySpark dataframe to a schema other than dbo (if you choose to save it in the Tables section of a lakehouse).

You can create a view from your lakehouse into a custom schema, or use the SQL endpoint of the lakehouse and create a view in your data warehouse under the custom schema.

PS: If my response helped, kindly select it as the solution. Your kudos are greatly appreciated!

Can you please guide me on saving to the default dbo schema only?
