smpa01
Super User

Managed Delta Table Error

I am trying to create a managed table in a lakehouse using a notebook, with rows entered manually (the SQL equivalent of INSERT INTO), but I am getting the following error and have no idea how to debug it. It seems to create the Delta table without any columns.

 

(Screenshot of the error attached: smpa01_1-1721451973020.png)

 

%%pyspark
from pyspark.sql import SparkSession
from pyspark.sql.types import *
from pyspark.sql import functions as sf
from datetime import datetime

# Initialize Spark session
spark = SparkSession.builder \
    .appName("session_one") \
    .getOrCreate()

# Schema of the watermark table
schema = StructType([
    StructField('id', IntegerType(), True),
    StructField('schema_name', StringType(), True),
    StructField('table_name', StringType(), True),
    StructField('watermark_value', TimestampType(), True),
    StructField('full_path', StringType(), True)
])

# Single row entered manually (SQL INSERT INTO equivalent)
row_one = [
    (1, 'lorem', 'ipsum', datetime(1, 1, 1, 0, 0, 0), None),
]

df_one = spark.createDataFrame(row_one, schema)

# Derive full_path as schema_name.table_name
df_two = df_one.withColumn('full_path', sf.concat(sf.col('schema_name'), sf.lit('.'), sf.col('table_name')))

df_two.show()

# Save as a managed Delta table in the lakehouse
df_two.write.format("delta").saveAsTable("watermark")

 

 

How can I satisfy the `No Delta transaction log entries were found` requirement?

1 ACCEPTED SOLUTION
smpa01
Super User

This issue can be solved by using the DeltaTableBuilder API.
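
A minimal sketch of that approach, assuming the delta-spark DeltaTableBuilder API (DeltaTable.createIfNotExists) available in Fabric Spark and the same watermark schema as in the question:

from delta.tables import DeltaTable
from pyspark.sql.types import IntegerType, StringType, TimestampType

# Explicitly create the managed table (and its Delta transaction log)
# with the full schema before any rows are written.
(DeltaTable.createIfNotExists(spark)
    .tableName("watermark")
    .addColumn("id", IntegerType())
    .addColumn("schema_name", StringType())
    .addColumn("table_name", StringType())
    .addColumn("watermark_value", TimestampType())
    .addColumn("full_path", StringType())
    .execute())

# Rows can then be appended with the DataFrame writer
# (df_two as defined in the question):
df_two.write.format("delta").mode("append").saveAsTable("watermark")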



REPLIES
frithjof_v
Super User

Or maybe this could work (I asked ChatGPT how to create a similar table with SQL syntax)

 

%%sql
-- Step 1: Create the table
CREATE TABLE watermark (
    id INT,
    schema_name VARCHAR(255),
    table_name VARCHAR(255),
    watermark_value TIMESTAMP,
    full_path VARCHAR(255)
);

-- Step 2: Insert data into the table
INSERT INTO watermark (id, schema_name, table_name, watermark_value, full_path)
VALUES (1, 'lorem', 'ipsum', '0001-01-01 00:00:00', NULL);

-- Step 3: Update the `full_path` column
UPDATE watermark
SET full_path = schema_name || '.' || table_name;

frithjof_v
Super User

Does it work if you use this code below?

 

---------------------------------------------

from pyspark.sql.types import *
from pyspark.sql import functions as sf
from datetime import datetime

schema = StructType([
    StructField('id', IntegerType(), True),
    StructField('schema_name', StringType(), True),
    StructField('table_name', StringType(), True),
    StructField('watermark_value', TimestampType(), True),
    StructField('full_path', StringType(), True)
])

row_one = [
    (1, 'lorem', 'ipsum', datetime(1, 1, 1, 0, 0, 0), None),
]

df_one = spark.createDataFrame(row_one, schema)
df_two = df_one.withColumn('full_path', sf.concat(sf.col('schema_name'), sf.lit('.'), sf.col('table_name')))

df_two.show()
df_two.write.mode("overwrite").saveAsTable("watermark")

-----------------------------------------------

 

I don't think you need to specify %%pyspark as this is the default.

 

I don't think you need to initialize the Spark session in your code in Fabric notebooks.

 

Maybe you need to add .mode("overwrite") or .mode("append") in the saveAsTable expression.
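
For example, the append variant could look like this (a sketch reusing df_two from the code above):

df_two.write.mode("append").saveAsTable("watermark")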

 

By the way, does your code run without errors if you remove line 28 (the saveAsTable line)?
