I currently have a lakehouse containing a Python notebook that queries a series of APIs, converts the retrieved data to Spark DataFrames, and writes the data to tables inside the lakehouse. I need to copy/move these tables from the lakehouse to a warehouse within the same workspace, but I have not found a good way to achieve this. I attempted to use a Data Factory pipeline to connect to the lakehouse and copy the tables over to the warehouse, but I got the following error:
ErrorCode=ParquetColumnIsNotDefinedInDeltaMetadata,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Invalid table! Parquet column is not defined in delta metadata. Column name: col-7040d188-5199-4794-a195-1d5b40fb7e3b.,Source=Microsoft.DataTransfer.DeltaDataFileFormatPlugin,'
Googling this error did not provide any guidance on how to fix it. Does anyone know how to resolve this error? Is there a better way to move data from a lakehouse to a warehouse that I am not aware of?
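For context, the write step in the notebook looks roughly like this (a minimal sketch only; the API helper, column names, and table name are placeholders, not my actual code):

```python
# Minimal sketch of the notebook's write step (placeholder names, not the actual code).
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # already available in a Fabric notebook session

rows = call_some_api()  # hypothetical helper returning rows as (id, value) tuples
df = spark.createDataFrame(rows, schema="id INT, value STRING")

# Write the DataFrame as a Delta table in the attached lakehouse.
df.write.format("delta").mode("overwrite").saveAsTable("api_results")
```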
Thank you for your help!
Hi @davidwolfson,
Thank you for your interest in this case.
Was the above advice helpful in your situation? If so, please consider accepting the helpful suggestions as the solution to help others facing similar requirements.
Best Regards,
Yang
Community Support Team
If any post helps, please consider accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!
Hi @davidwolfson,
Did you ever get a resolution to this issue? I'm seeing the same thing when trying to copy data from a lakehouse to on-prem SQL. Not all tables (from the same source) give the error. I've had a ticket open with Microsoft since August, but I'm not getting much help there.
For now, I used a dataflow to copy from the lakehouse to a warehouse. You may be able to set your on-prem server as a destination as long as the database is an Azure SQL DB.
I also put in a support ticket, and they suggested using the SQL endpoint of the lakehouse in the copy activity. I have not tried it, but it may be a viable workaround.
@davidwolfson - thank you!!! Both suggestions worked! A dataflow really isn't a great option, since it would be way too time-consuming to stand up and maintain. However, the pipeline connected to the SQL endpoint did work. Thanks again, this should keep us moving until (if ever) the original issue is addressed.
@WomanToBlame Can you please explain how you connected the SQL endpoint to the pipeline? Thanks!
@ebjim - use the Azure SQL connector. For the server, use your SQL connection string, and for the database name, use your lakehouse name.
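If you want to sanity-check those details outside the pipeline, the same connection information works from any standard SQL client, since the lakehouse's SQL analytics endpoint behaves like an Azure SQL database (which is why the Azure SQL connector works). A rough pyodbc sketch, where the server, lakehouse, and table names are placeholders:

```python
# Sketch of connecting to the lakehouse SQL analytics endpoint as if it were an
# Azure SQL database. The server, lakehouse, and table names below are placeholders.
import pyodbc

conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    # The "SQL connection string" shown on the lakehouse's SQL analytics endpoint.
    "Server=<your-sql-connection-string>.datawarehouse.fabric.microsoft.com;"
    "Database=<YourLakehouseName>;"  # the lakehouse name, as noted above
    "Authentication=ActiveDirectoryInteractive;"
    "Encrypt=yes;"
)

with pyodbc.connect(conn_str) as conn:
    cursor = conn.cursor()
    cursor.execute("SELECT TOP 10 * FROM dbo.api_results")  # placeholder table
    for row in cursor.fetchall():
        print(row)
```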
It should be possible to use a Data Pipeline copy activity to do this.
A data pipeline is supposed to be more efficient than Dataflow Gen2.
However, you could try the Fast Copy option in Dataflow Gen2, which should be closer to the Data Pipeline copy activity in terms of performance (it is still a preview feature, expected to become generally available sometime in Q3). https://learn.microsoft.com/en-us/fabric/data-factory/dataflows-gen2-fast-copy
-----
I would consider opening a support ticket regarding the error you are receiving.
Did you apply any special settings, or did you just go with the defaults when configuring the copy activity? I think the default settings should be fine. Did you verify that there are no errors in the mapping?
I will open a support ticket. I used the default settings and the schema mapping was correct. I even tried doing it with the copy assistant and it gave me the same error.
Hi @davidwolfson,
Thank you for your interest in this case.
I am just following up to ask whether the problem has been solved.
If so, could you accept the correct answer as the solution, or share your own solution, to help other members find it faster?
Thank you very much for your cooperation!
Best Regards,
Yang
Community Support Team
If any post helps, please consider accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!
Hi @davidwolfson,
Thanks for the reply from kandersonit.
For me, using Dataflow Gen2 to copy tables from the lakehouse to the warehouse works well.
I recommend that you double-check that you are following the steps correctly.
You can use the "+ Warehouses" option inside the warehouse to connect to the lakehouse you want to use, so that its tables are available there.
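If you prefer T-SQL, once the lakehouse has been added this way its tables can be referenced from the warehouse with three-part names, for example to materialize a copy with CREATE TABLE AS SELECT. A rough sketch (all names are placeholders), run here with pyodbc against the warehouse's SQL connection string; the same statement can also be pasted into the warehouse's SQL query editor:

```python
# Sketch only: copy a lakehouse table into the warehouse with a cross-database
# CREATE TABLE AS SELECT, after the lakehouse has been added via "+ Warehouses".
# The server, warehouse, lakehouse, and table names below are placeholders.
import pyodbc

conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your-sql-connection-string>.datawarehouse.fabric.microsoft.com;"  # the warehouse's SQL connection string
    "Database=<YourWarehouse>;"
    "Authentication=ActiveDirectoryInteractive;"
    "Encrypt=yes;"
)

ctas = """
CREATE TABLE dbo.api_results
AS SELECT * FROM [<YourLakehouse>].[dbo].[api_results];
"""

with pyodbc.connect(conn_str, autocommit=True) as conn:
    conn.execute(ctas)  # DDL takes effect immediately with autocommit enabled
```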
If you have any other questions please feel free to contact me.
Best Regards,
Yang
Community Support Team
If any post helps, please consider accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!
A Dataflow Gen2 can copy data from a lakehouse and set a warehouse as the destination.
Create your first Microsoft Fabric dataflow - Microsoft Fabric | Microsoft Learn
For simple cases, Dataflow Gen2 works well. However, it falls short for complex cases like the AdventureWorks database where binary data types are needed; for example, the "LargePhoto" column in the DimProduct table is not supported.