Hi,
I've tried to create a new table in a lakehouse at the Silver layer by using a copy activity in a data pipeline, in order to import data from a CSV file, but it seems the table couldn't be created.
Using a Dataflow Gen2, it is possible to create a new table in a lakehouse without using a Spark notebook.
Now, is there a way to create a new lakehouse table from a data pipeline without using a Spark notebook? Thanks
Not sure what you mean by "new table couldn't be created".
In the copy activity, under Destination:
Sink output:
Hi, thanks for your reply.
I'm trying to import this CSV into a new lakehouse table,
but I get this error:
Failure happened on 'destination' side. ErrorCode=DeltaInvalidCharacterInColumnName,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Column name Descrizione contains invalid characters. ",;{}()\n\t=" are not supported.,Source=Microsoft.DataTransfer.ClientLibrary,'
I don't remember such an issue with Azure Data Factory.
Any suggestions, please? Thanks
@pmscorca - Is it picking up the leading space in the " Descrizione" column? If you manually update the file and remove the space, does it flow all the way through? Lakehouse tables do not like spaces in table or column names, so that could be part of the problem here.
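As a workaround, you could pre-clean the CSV header row before the copy activity picks the file up. Here's a minimal Python sketch; it assumes (based only on the error message above) that the characters to handle are `",;{}()\n\t="` plus leading/trailing whitespace, and the function names are my own, not a Fabric API:

```python
import csv
import io
import re

# Characters the Delta destination rejects in column names,
# per the DeltaInvalidCharacterInColumnName error: " ,;{}()\n\t="
INVALID_CHARS = re.compile(r'[ ,;{}()\n\t=]')

def sanitize_header(fieldnames):
    """Strip surrounding whitespace and replace invalid characters with '_'."""
    return [INVALID_CHARS.sub('_', name.strip()) for name in fieldnames]

def clean_csv_header(raw_csv: str) -> str:
    """Return the CSV text with a Delta-safe header row; data rows untouched."""
    rows = list(csv.reader(io.StringIO(raw_csv)))
    rows[0] = sanitize_header(rows[0])
    out = io.StringIO()
    csv.writer(out, lineterminator='\n').writerows(rows)
    return out.getvalue()
```

For example, a header like `" Descrizione,Importo (EUR)"` becomes `"Descrizione,Importo__EUR_"`. You could run something like this in a pipeline step (or fix the source file once by hand, as below) so the copy activity can create the table.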
Hi, thanks for your reply.
I've removed the leading space, which I only noticed later, and now I can create the lakehouse table. Thanks