Anonymous
Not applicable

Fabric Notebook - write dataframe to lakehouse fails - 'DataFrame' object has no attribute 'write'

Hi.

I am trying to write some data to a lakehouse.

Setup

  • A lakehouse named "Metadata"
  • A notebook that
    • Gets some data
    • Writes it to the lakehouse

An error occurs when writing to the lakehouse.

 

Reproducible example, in a notebook:

 

# Semantic Link package used to retrieve data.
import sempy
import sempy.fabric as fabric

workspaces_df = fabric.list_workspaces()

delta_table_name = "Workspaces"
workspaces_df.write.mode("overwrite").format("delta").saveAsTable(delta_table_name)

 

 

Running the notebook returns the error message:

 

AttributeError: 'DataFrame' object has no attribute 'write'

 

 

I have tried
- converting the dataframe to a Spark dataframe

but have not found a solution so far.

Does anyone have a working solution for saving the output from Semantic Link to a lakehouse file or table?
If so, it would be much appreciated.

Kind regards

 

1 ACCEPTED SOLUTION
blopez11
Super User

You definitely need to convert to a spark dataframe before writing.  Can you show the code you used where you converted to a spark dataframe?  I used spark.createDataFrame.
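
A minimal sketch of that approach (not from the original posts), assuming the built-in spark session of a Fabric notebook and that Spark can infer the column types; the rename step is only needed because some list_workspaces() columns contain spaces, which Delta rejects by default, and the table name is illustrative:

import sempy.fabric as fabric

# fabric.list_workspaces() returns a pandas DataFrame, which has no .write attribute
workspaces_pdf = fabric.list_workspaces()

# Delta rejects column names containing spaces, so replace them first
workspaces_pdf.columns = [c.replace(" ", "_") for c in workspaces_pdf.columns]

# Convert to a Spark DataFrame, then use the Spark writer API
workspaces_sdf = spark.createDataFrame(workspaces_pdf)
workspaces_sdf.write.mode("overwrite").format("delta").saveAsTable("Workspaces")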


2 REPLIES
Anonymous
Not applicable

Hi. Thanks for the reply, @blopez11.

It works using the conversion to a Spark dataframe as you suggested.

The code below works:

 

import sempy
import sempy.fabric as fabric

workspaces = fabric.list_workspaces()

# For simplicity, keep only columns without " " in their names, to avoid an
# "Invalid column name" error when writing
workspaces_subset = workspaces[['Type', 'Id']]

workspaces_subset_spark = spark.createDataFrame(workspaces_subset)

# Write to the default lakehouse
table_name = "workspaces_subset"
workspaces_subset_spark.write.format("delta").mode("overwrite").saveAsTable(table_name)
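
As a side note (not part of the original reply), if you want the output under the lakehouse Files section rather than as a table, the pandas dataframe can be written directly, assuming a default lakehouse is attached to the notebook and mounted under /lakehouse/default/; the file name is illustrative:

# Write the pandas DataFrame to the Files area of the attached default lakehouse
workspaces_subset.to_csv("/lakehouse/default/Files/workspaces_subset.csv", index=False)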

 


The error I experienced was probably because the dataframe was unintentionally converted to a Series before I tried to convert it to a Spark dataframe.

The unintentional conversion to a Series came from selecting a single column from the dataframe, i.e.

# Data type unintentionally changes from pandas DataFrame to Series
workspaces_series = workspaces_df['Id']
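
As a side note, selecting the column with a one-element list keeps the pandas DataFrame type, so the Spark conversion still works (the variable names here are illustrative):

# Double brackets return a one-column pandas DataFrame instead of a Series
workspaces_id = workspaces_df[['Id']]
workspaces_id_spark = spark.createDataFrame(workspaces_id)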

 

Thanks for the help!

