Arnab0111
Helper I

Save a PySpark DataFrame to a warehouse table using a notebook

I am using a notebook and have a PySpark DataFrame. Please guide me in saving it, in overwrite mode, as a table in the warehouse under a custom schema.

1 ACCEPTED SOLUTION
puneetvijwani
Resolver IV

@Arnab0111 I would suggest navigating to the Data Engineering experience from the bottom left and opening the "Use sample" section. There you will find the data engineering starter kit, which has many examples of writing and transforming data. It shows how to save a table into the Tables section of a Lakehouse, along with data prep from a data engineering point of view. This saves the table in the Lakehouse's default catalog.

After that, open the data warehouse and add your Lakehouse SQL endpoint.

[screenshot: epunvij_0-1693315706438.png]

 


Query the dbo schema table from the Lakehouse and save it as a view in the custom schema of your data warehouse.

[screenshot: epunvij_1-1693315828900.png]
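The warehouse-side step can be sketched as T-SQL to run in the warehouse's SQL query editor after the Lakehouse SQL endpoint is added; all names here (`MyLakehouse`, `reporting`, `sales_demo`) are illustrative assumptions:

```python
# T-SQL sketch for the warehouse SQL query editor: expose a Lakehouse
# dbo table as a view under a custom warehouse schema. The three-part
# name (MyLakehouse.dbo.sales_demo) is a cross-database reference to
# the Lakehouse SQL endpoint; all names are illustrative.
create_view_tsql = """
CREATE SCHEMA reporting;
GO
CREATE VIEW reporting.sales_demo_v AS
SELECT id, name
FROM MyLakehouse.dbo.sales_demo;
"""
print(create_view_tsql)
```

The `CREATE SCHEMA` statement must run in its own batch (hence the `GO`); skip it if the custom schema already exists.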

 




3 REPLIES

puneetvijwani
Resolver IV

@Arnab0111 As of now, it is not possible to save a PySpark DataFrame to a schema other than dbo (if you choose to save it in the Tables section of a Lakehouse).

You can create a view in your Lakehouse, or use the SQL endpoint of the Lakehouse and create a view in your data warehouse under a custom schema.

PS: If my response helped, kindly select it as the solution. Your kudos are greatly appreciated!

Can you please guide me in saving to the default dbo schema only?
