Hi,
I'm having issues with a Copy activity in a pipeline. The pipeline has a ForEach that loops through an array of variables, issuing a GET request to an API for each one. The output is a Parquet file that lands in Azure Data Lake Storage (ADLS Gen2), which I then copy into a Lakehouse table.
When I try to copy from storage to the Lakehouse table, I consistently get this error:
Activity failed because an inner activity failed; Inner activity name: Copy data6, Error: ErrorCode=ParquetColumnNotFound,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Column schema.table.date does not exist in Parquet file.,Source=Microsoft.DataTransfer.Richfile.ParquetTransferPlugin,'
Copy data6 is configured with File path type: File path, Format: Parquet, and the mapping has been set.
Can anyone give me some guidance?
@rwhitePWT - Is it possible to post a picture of the pipeline itself and the mapping for the 2nd Copy Step?
If I am following correctly:
- You have an Array Variable defined in your Pipeline which has the details on how to pull data
- The ForEach loops through this Array
- In the child definition, you have a Copy Activity making an API call (GET), with the results copied to ADLS Gen2
- You have a second Copy step that moves that file from ADLS Gen2 to a Fabric Lakehouse (the error happens here)
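Before digging into the mapping UI, it's worth dumping the actual schema of one of the staged files: ParquetColumnNotFound means the source side of the mapping names a column ("schema.table.date") that isn't in the file, and that name looks more like a destination-style identifier than a typical API field. A minimal sketch for inspecting the file, assuming pyarrow and adlfs are available (the account, container, and path are placeholders):

```python
# Sketch: print the real column names/types in one staged Parquet file,
# to compare against the source side of the Copy data6 mapping.
# Assumes pyarrow + adlfs; account/container/path are placeholders.
import pyarrow.parquet as pq
from adlfs import AzureBlobFileSystem

fs = AzureBlobFileSystem(account_name="<storage-account>")  # picks up default Azure credentials
schema = pq.read_schema("<container>/<path>/output.parquet", filesystem=fs)
print(schema)  # every column name plus its physical/logical type
```

If the printed names don't match what the mapping expects, re-importing the schema on the source side of Copy data6 should clear the ParquetColumnNotFound error.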
Questions:
- Curious why you are copying the data to ADLS Gen2 and then bringing it back into Fabric, instead of just writing the API output to the "Files" section of the Lakehouse. There are definitely legitimate reasons for doing this; just curious.
- Is this happening to the output of all the API calls, or do some files make the round trip just fine?
- For the file written to ADLS Gen2, are all of your date fields set to datetime as the data type rather than just date? I have run into issues where loading a Lakehouse table with a plain "Date" datatype fails because it needs to be a DateTime. Might be worth checking; see the sketch below for a quick way to test it.
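If it does turn out to be a date vs. datetime problem, a quick way to test is to cast the date columns to timestamps in a copy of one failing file and retry the load. A rough sketch with pyarrow (file names are placeholders):

```python
# Sketch: cast any date32/date64 columns to timestamp[us] and rewrite the file.
# File names are placeholders; run this against a copy of one failing file.
import pyarrow as pa
import pyarrow.parquet as pq

table = pq.read_table("output.parquet")
for i, field in enumerate(table.schema):
    if pa.types.is_date(field.type):
        # Lakehouse tables tend to be happier with timestamps than plain dates
        table = table.set_column(i, field.name, table.column(i).cast(pa.timestamp("us")))
pq.write_table(table, "output_datetime.parquet")
```

If the rewritten file loads cleanly, you can fix it upstream by having the first Copy activity write that field as a datetime instead.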