amaaiia
Super User

Add new column in Data Pipeline source as Date type

Hi,

I'm trying to write some data into a lakehouse with a COPY activity. I need to add a customized column to the destination lakehouse table. This column is a date, but when it's written to the lakehouse table it ends up as a string. How can I set this column to the date type? I've tried formatting the value with formatDateTime:

@{formatDateTime(item().file_date,'yyyy-MM-dd')}

[screenshot: amaaiia_0-1730304365375.png]

But it still writes it as string.

Please don't tell me to set the column type in the Mapping tab: this Data Pipeline is going to be used with different sources, that is, different schemas, so I can't specify a fixed data schema.

Any ideas?
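For context, as far as I know the Copy activity injects additional columns as plain string values, so the result of formatDateTime stays a string no matter how it is formatted. A hedged sketch of what the relevant part of the pipeline JSON looks like (the source type and names are illustrative, taken from the post):

```json
"source": {
    "type": "DelimitedTextSource",
    "additionalColumns": [
        {
            "name": "file_date",
            "value": {
                "value": "@formatDateTime(item().file_date,'yyyy-MM-dd')",
                "type": "Expression"
            }
        }
    ]
}
```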

5 REPLIES
FabianSchut
Super User

I did not test this myself, but in the Mapping settings you are able to set the 'Type conversion settings'. This does not require a fixed data schema, but it lets you specify a date string format that should be read as a date type. If you set the 'Date format' to "yyyy-MM-dd", in combination with your already formatted date string @{formatDateTime(item().file_date,'yyyy-MM-dd')}, it may work. Here is the documentation: https://learn.microsoft.com/en-us/fabric/data-factory/data-type-mapping.
Do note that not all sources are supported; the list of supported items is in the link.
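If it helps, those settings correspond to the translator section of the Copy activity definition. A sketch of what that JSON fragment could look like based on the type conversion settings described in that documentation (values are illustrative):

```json
"translator": {
    "type": "TabularTranslator",
    "typeConversion": true,
    "typeConversionSettings": {
        "allowDataTruncation": false,
        "treatBooleanAsNumber": false,
        "dateFormat": "yyyy-MM-dd"
    }
}
```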

Hi,

It's not working. I've set the Date format exactly as I specified it in the formatDateTime function:

[screenshot: amaaiia_0-1730371639734.png]

and if I check the destination table, the field is still in string format:

[screenshot: amaaiia_1-1730371709191.png]

Just to be sure, did you drop the table before running with this new setting?

v-nuoc-msft
Community Support

Hi @amaaiia 

 

It seems difficult to achieve your purpose in the pipeline. I recommend doing this in a dataflow instead; it is more convenient to clean and transform data there.

 

After connecting to the data source in the dataflow, you can add custom date columns.
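For the custom column step, a sketch of the Power Query M you could use in the dataflow (FileDate is a hypothetical text column standing in for your source value; ascribing `type date` in the last argument is what keeps the column from landing as string):

```
= Table.AddColumn(Source, "file_date", each Date.FromText([FileDate]), type date)
```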

 

[screenshot: vnuocmsft_0-1730339023683.png]

 

[screenshot: vnuocmsft_1-1730339034105.png]

 

Set the destination as lakehouse.

 

[screenshot: vnuocmsft_2-1730339159087.png]

 

[screenshot: vnuocmsft_0-1730340116256.png]

 

Regards,

Nono Chen

If this post helps, then please consider Accepting it as the solution to help other members find it more quickly.

I need to develop this use case with a Data Pipeline because I need to parameterize the whole flow, and Dataflow Gen2 doesn't work well with parameters. I need to ingest a set of tables listed in a warehouse as master data, so I iterate through each row of that warehouse table and ingest the table it specifies. The destination table, the source, the schema, etc. are different for each table, which is difficult to develop in a dataflow.
