bhanu_bi
Frequent Visitor

Using Copy activity with Parquet file format, all columns have data type string

I am trying to use a Copy activity to copy a Snowflake table into Lakehouse Files as the destination, with Parquet as the file format in the destination settings. The files load successfully, but when I use a notebook to create a table based on those files, every column's data type comes out as string.
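For reference, the notebook step looks roughly like this (a minimal sketch assuming PySpark in a Fabric notebook with a default lakehouse attached; "Files/orders" is a placeholder path):

```python
# Minimal sketch of the notebook step, assuming PySpark in a Fabric notebook
# with a default lakehouse attached. "Files/orders" is a placeholder folder
# for the Parquet files written by the Copy activity.
df = spark.read.parquet("Files/orders")

# Every column comes back as string instead of the expected numeric types.
df.printSchema()

# Creating a Lakehouse table from the DataFrame carries those string types over.
df.write.mode("overwrite").saveAsTable("orders")
```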

 

I have updated the mapping in the Copy activity, but the result is still the same.

 

[Screenshots: Copy activity destination settings, column mapping, and the resulting table schema]

 

As seen in the screenshots, O_ORDERKEY is showing as string instead of number.

Is this expected behavior?

Is there something I am missing?

Can you suggest how to pick up the data types that have already been defined in the Copy activity mapping?
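To narrow down where the types are being lost, here is a sketch of checking the schema stored in the Parquet files themselves (pyarrow, the /lakehouse/default mount path, and the file name are assumptions):

```python
# Sketch: read the schema stored in the Parquet footer itself, to see whether the
# types were already lost when the files were written. Assumes pyarrow is available
# and the default lakehouse is mounted at /lakehouse/default; the file name is a
# placeholder.
import pyarrow.parquet as pq

schema = pq.read_schema("/lakehouse/default/Files/orders/part-00000.parquet")
print(schema)  # if O_ORDERKEY is already string here, the copy itself wrote strings
```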

1 ACCEPTED SOLUTION

The issue in this case was with the Snowflake instance. We tried the same workflow with Azure SQL and it worked fine. If you are using Snowflake and facing this issue, check the query execution in Snowflake's query monitoring: you will notice the file format is CSV, and you may have to do additional tweaking to get it to Parquet.
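If you want to check this programmatically rather than in the monitoring UI, a rough sketch with snowflake-connector-python (every connection parameter below is a placeholder) is to scan the recent COPY statements for the file format they used:

```python
# Sketch: list the recent COPY statements Snowflake executed for the pipeline and
# check which FILE_FORMAT they used. Assumes snowflake-connector-python; every
# connection parameter below is a placeholder.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",
    user="your_user",
    password="your_password",
    warehouse="your_warehouse",
    database="your_database",
)
cur = conn.cursor()
cur.execute("""
    SELECT query_text, start_time
    FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY())
    WHERE query_text ILIKE '%COPY INTO%'
    ORDER BY start_time DESC
    LIMIT 20
""")
for query_text, start_time in cur.fetchall():
    # Look for FILE_FORMAT = (TYPE = CSV ...) vs. TYPE = PARQUET in the statement text.
    print(start_time, query_text[:200])
conn.close()
```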

 

If you disable staging during this operation, the pipeline doesn't work either. That's one more thing to check.


4 REPLIES
bhanu_bi
Frequent Visitor

Hi @Anonymous, thank you for the tip. We enabled the checkbox, but the same behavior persists.

We are copying from Snowflake SQL to Lakehouse Files in parquet format.

In your screenshot, the Processed Table is inside the Lakehouse Tables location.

Is there any processing that needs to be done?

Thank you.

 


Anonymous
Not applicable

Hi @bhanu_bi ,

 

Good to hear; thanks for your solution and feedback.

 

Best regards,

Adamk Kong

Anonymous
Not applicable

Hi @bhanu_bi ,

 

When you change the data type during the mapping process, you need to check the box in front of each field whose type you are changing. Otherwise, the change will not take effect in the copied data.

[Screenshot: mapping with the checkbox enabled for the field being changed]

Successfully changed:

[Screenshot: result showing the updated data type]
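If the mapped types still do not come through in the files, one interim workaround (just a sketch; the path and the column list are placeholders, not part of the original answer) is to cast the columns explicitly in the notebook before saving the table:

```python
# Sketch of an interim workaround: cast the columns explicitly in the notebook
# before saving the table. The path and the column list are placeholders.
from pyspark.sql.functions import col

df = spark.read.parquet("Files/orders")
df = (
    df.withColumn("O_ORDERKEY", col("O_ORDERKEY").cast("bigint"))
      .withColumn("O_TOTALPRICE", col("O_TOTALPRICE").cast("decimal(12,2)"))
)
df.write.mode("overwrite").saveAsTable("orders")
```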

 

Best Regards,
Adamk Kong

 

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
