Hi,
I'm trying to create a reusable table template for creating an empty table in a lakehouse using a Spark notebook.
This is the PySpark code I'm trying:
from pyspark.sql.types import StructType, StructField, StringType, IntegerType, DateType, TimestampType
#from pyspark.sql.types import VarcharType, CharType
from datetime import datetime

# table template
schema = StructType([
    StructField("code", StringType(), True),               # string field
    StructField("description", StringType(), True),        # string field
    StructField("revenue", IntegerType(), True),           # integer field
    StructField("insert_date", DateType(), True),          # date field
    StructField("insert_datetime", TimestampType(), True)  # datetime field
])
If possible, I'd also like to use VarcharType() and CharType() with a specified length, and it would be useful to specify a size for IntegerType() as well. I've run some tests, and it seems that VarcharType() and CharType() with a specified length aren't supported.
Any suggestions for completing this template? Thanks
This article may be relevant for this topic: https://blog.gbrueckl.at/2024/01/using-varchar-in-microsoft-fabric-lakehouses-and-sql-endpoints/
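Following the approach discussed in the linked article, one way to get VARCHAR/CHAR lengths into a lakehouse table from a notebook is to issue a CREATE TABLE statement through Spark SQL instead of a StructType, since the DDL can carry length specifications. A minimal sketch (the table name, column list, and lengths here are assumptions for illustration):

```python
# Build a CREATE TABLE DDL string carrying VARCHAR/CHAR lengths,
# which a StructType-based schema cannot express for this purpose.
def build_create_table_ddl(table_name, columns):
    """columns: list of (name, sql_type) tuples, e.g. ("code", "VARCHAR(10)")."""
    cols = ",\n  ".join(f"{name} {sql_type}" for name, sql_type in columns)
    return f"CREATE TABLE IF NOT EXISTS {table_name} (\n  {cols}\n) USING DELTA"

ddl = build_create_table_ddl(
    "my_table",  # hypothetical table name
    [
        ("code", "VARCHAR(10)"),         # length declared in the DDL
        ("description", "VARCHAR(255)"),
        ("revenue", "INT"),
        ("insert_date", "DATE"),
        ("insert_datetime", "TIMESTAMP"),
    ],
)
print(ddl)
# In a Fabric notebook you would then execute the statement, e.g.:
# spark.sql(ddl)
```

This keeps the template reusable from PySpark: the column list can be changed per table, and the same helper generates the DDL each time.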
Hi, thanks, your suggestion is good and interesting.
However, I'd still like to solve this with PySpark.