DebbieE
Community Champion

Fabric: I have created a DataFrame in a Notebook using PySpark. Now I want to create a Delta PARQUET table

And I assume it is going in here:

[screenshot: Tables.png — the Tables section of the Lakehouse]

I use this code:

 

from delta.tables import DeltaTable

dfcont.write.format("parquet").mode("overwrite").save("Data/silver/parquet_table")
 
This is where I would like the PARQUET file to sit, but it's just throwing errors.

For example:
 
Py4JJavaError: An error occurred while calling o6222.save. : Operation failed: "Bad Request", 400, HEAD, http://onelake.dfs.fabric.microsoft.com/cb0d4c81-7750-4e3b-9192-605359abb083/da5a0bf1-6d37-4baa-a7d6...
 
I am also having an issue with this coming up:
[screenshot: NotebookOpeningOnnotdefaultIssue.png]
When I go to Data sources > Lakehouses > Existing Lakehouses, the Lakehouse containing the data is greyed out and I can't select it. I'm wondering if that's part of the issue?
 
This is the last step, and it would be great to create my first PARQUET file.
1 ACCEPTED SOLUTION
Anonymous
Not applicable

Hi @DebbieE ,

In order to create a table, you need to use the code below.

Method 1:

 

df.write.mode("overwrite").format("delta").saveAsTable("abc5")

 



[screenshot: vgchennamsft_0-1712742498339.png]

In your case, you just missed including the quotes around the table name.
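
As a minimal sketch, applying Method 1 to the dataframe from the original post might look like this (the table name here is only an example):

# dfcont is the PySpark dataframe created earlier in the notebook.
# Passing the table name as a quoted string writes a Delta table to the
# Tables section of the default Lakehouse.
dfcont.write.mode("overwrite").format("delta").saveAsTable("dimcontestant")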

Method 2:

[screenshot: vgchennamsft_1-1712743126423.png]


Hope this is helpful. Please let me know in case of further queries.




5 REPLIES
smoqt
Advocate I

Depending on your desired result, there are different methods.  The method you are using will write a parquet file to the Files location.  If you want to create a Delta table, you should use the saveAsTable function with "delta" as the format. 

 

Keep in mind that table names can only contain alphanumeric characters and underscores.
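
For instance, a name like "Dim Contestant-2024" would be rejected. A hedged sketch of normalising a name before saving (the helper and names here are purely illustrative, assuming df is your dataframe):

import re

# Replace anything that is not alphanumeric or an underscore so the
# name is acceptable to saveAsTable.
raw_name = "Dim Contestant-2024"
table_name = re.sub(r"[^0-9A-Za-z_]", "_", raw_name)  # -> "Dim_Contestant_2024"

df.write.mode("overwrite").format("delta").saveAsTable(table_name)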

 

From the documentation:

 

# Keep it if you want to save dataframe as CSV files to Files section of the default Lakehouse
df.write.mode("overwrite").format("csv").save("Files/" + csv_table_name)

# Keep it if you want to save dataframe as Parquet files to Files section of the default Lakehouse
df.write.mode("overwrite").format("parquet").save("Files/" + parquet_table_name)

# Keep it if you want to save dataframe as a delta lake, parquet table to Tables section of the default Lakehouse
df.write.mode("overwrite").format("delta").saveAsTable(delta_table_name)

# Keep it if you want to save the dataframe as a delta lake, appending the data to an existing table
df.write.mode("append").format("delta").saveAsTable(delta_table_name)

 

 

DebbieE
Community Champion

I am confused by this:

 

df.write.mode("overwrite").format("delta").saveAsTable(delta_table_name)

NameError: name 'DimContestant' is not defined

It doesn't exist as it's brand new, so this doesn't work for me.

 

Ahhhh, I found out you have to do this:

("DimContestant")
 
As an aside, is there any way you can include a file path, or do they all just go straight into the Tables folder with no file path mentioned?
 
 
 

Anonymous
Not applicable

Hi @DebbieE ,

Glad to know that your query got resolved. Please continue using the Fabric Community for your further queries.

Anonymous
Not applicable

Hi @DebbieE ,

We haven't heard from you since the last response and were just checking back to see if your query was answered. Otherwise, please respond back with more details and we will try to help.
