Hi,
I've tried to create a new table in a lakehouse at the Silver layer by using a copy activity in a data pipeline, in order to import data from a CSV file, but it seems that the table couldn't be created.
With a Dataflow Gen2 it is possible to create a new table in a lakehouse without using a Spark notebook.
So, is there a way to create a new lakehouse table from a data pipeline without using a Spark notebook? Thanks
Not sure what you mean by "new table couldn't be created".
In the copy activity, under destination:
Sink output:
Hi, thanks for your reply.
I'm trying to import this CSV into a new lakehouse table,
but I get this error:
Failure happened on 'destination' side. ErrorCode=DeltaInvalidCharacterInColumnName,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Column name Descrizione contains invalid characters. ",;{}()\n\t=" are not supported.,Source=Microsoft.DataTransfer.ClientLibrary,'
I don't remember such an issue with Azure Data Factory.
Any suggestions, please? Thanks
@pmscorca - Is it picking up the leading space in the " Descrizione" column? If you manually update the file and remove the space, does it flow all the way through? Lakehouse tables do not like spaces in table or column names, so that could be part of the problem here.
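If editing the file by hand isn't practical, the headers can be pre-cleaned before the copy activity runs. Below is a minimal Python sketch (function names and the sample data are illustrative, not part of any Fabric API): it strips whitespace from each header and replaces the characters listed in the Delta error message with underscores, leaving the data rows untouched.

```python
import csv
import io
import re

# Characters the Delta error message lists as unsupported in column names,
# plus spaces, which lakehouse tables also dislike.
INVALID = re.compile(r'[ ,;{}()\n\t=]')

def clean_header(name: str) -> str:
    """Strip surrounding whitespace, then replace unsupported chars with '_'."""
    return INVALID.sub('_', name.strip())

def sanitize_csv(text: str) -> str:
    """Return the CSV text with only its header row cleaned."""
    rows = list(csv.reader(io.StringIO(text)))
    rows[0] = [clean_header(c) for c in rows[0]]
    out = io.StringIO()
    csv.writer(out, lineterminator='\n').writerows(rows)
    return out.getvalue()

# Example: a header with a leading space, as in the error above
raw = " Descrizione,Importo\nabc,10\n"
print(sanitize_csv(raw))  # header becomes "Descrizione,Importo"
```

Run this as a pre-processing step on the source file (or in whatever staging logic sits before the pipeline), and the copy activity should no longer hit DeltaInvalidCharacterInColumnName.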
Hi, thanks for your reply.
I've removed the leading space, which I only noticed later, and now I can create the lakehouse table. Thanks