Hi All.
I'm unsure if this is the right place to ask my question, so apologies if it is not.
I'm using PySpark.
I've got some code that writes a DataFrame to a Delta table in an attached lakehouse:
spark_df.write.format("delta") \
    .mode("overwrite") \
    .saveAsTable(delta_table_name)
Afterwards, I try to read that data back via:
read_df = spark.read.format("delta").table(delta_table_name)
And I get some weird stuff back.
DataFrame[Event_Class: string, Event_Subclass: string, Current_Time: timestamp, Text_Data: string, Start_Time: timestamp, End_Time: timestamp, Duration: bigint, Cpu_Time: bigint, Success: string, Integer_Data: bigint, Object_ID: string, Table_Name: string, Partition_Name: string, Start: double, End: double]
It's not until I restart the kernel and rerun the read that I get actual rows and columns back.
What's going on? How can I ensure I get data back from my query without a kernel restart (or session stop/start)?
- David
I believe I have solved this.
I had the following in the code:
from IPython.display import display
This import shadowed the notebook's built-in display() function, so when I called display(df) it printed the DataFrame's repr (the schema output above) instead of rendering the rows.
When I stopped and restarted the kernel and ran a separate code block to "do the same thing", the import was not rerun, so the built-in display() was back in scope and I got the expected results.
Learning...
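For anyone else who hits this, here's a minimal sketch of the shadowing (names are illustrative; it assumes a Fabric notebook session where spark and the built-in display() are provided by the environment):

# Assumes a Fabric notebook, where `spark` and a built-in display()
# (an interactive table renderer) are provided by the environment.
delta_table_name = "my_table"  # hypothetical table name

read_df = spark.read.format("delta").table(delta_table_name)
display(read_df)   # built-in: renders rows and columns as a grid

# This import rebinds the name `display` to IPython's version:
from IPython.display import display

# IPython's display() has no special renderer for Spark DataFrames,
# so it falls back to repr() -- the DataFrame[...] schema string above:
display(read_df)

# Ways to see the rows again without restarting the kernel:
read_df.show(20)   # plain-text preview; unaffected by the shadowing
del display        # drop the imported name; this may restore the notebook's
                   # display() if it is injected below module scope (assumption)

In a Fabric notebook the IPython import usually isn't needed at all, since the built-in display() already handles Spark DataFrames.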