ringovski
Helper II

Copy Data to Lakehouse - Edit Columns

Hi All,

I am copying data from an on-premises SQL data source to a lakehouse with a series of Copy Assistant 'ForEach' loops. However, one of the loops is failing because of a space in a destination column name.

 

[Screenshot attachment: ringovski_0-1742432096226.jpeg]

 

The problem is that I am trying to find a way to edit the columns without deleting the whole thing, and I can't find any way to do it. Is this possible?

 

Thanks


9 REPLIES
v-tsaipranay
Community Support

Hi @ringovski ,

Thank you for reaching out to the Microsoft Fabric Community and providing the details.

 

After reviewing your setup and the conversation, I can confirm that @suparnababu8  gave the right advice. The issue happens because Lakehouse doesn’t support column names with spaces, which is why your pipeline fails.

The best way to fix this without rebuilding the whole pipeline is to add a Dataflow Gen2 activity before the Copy Data activity. This lets you pull the data from SQL, rename columns to remove spaces (e.g., Customer Name to Customer_Name), and send the clean data to your Lakehouse. Once the columns are cleaned up in Dataflow Gen2, the Copy Data activity should work without errors.

You can find detailed steps on setting up Dataflow Gen2 here: Create your first Dataflow Gen2 in Microsoft Fabric.
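
If a notebook step is easier for you than a Dataflow Gen2, the same cleanup can also be scripted. The sketch below is only illustrative and assumes the Copy activity lands the raw data as CSV under the Lakehouse Files area; the folder and table names are hypothetical. It simply replaces spaces in the column names with underscores before writing a Delta table.

# Illustrative sketch (PySpark in a Fabric notebook): rename columns that
# contain spaces before writing a Lakehouse Delta table.
# The folder and table names below are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Read the raw CSV that the Copy activity landed under the Files area.
df = spark.read.option("header", True).csv("Files/raw/customers")

# Replace spaces in every column name, e.g. "Customer Name" -> "Customer_Name".
for old_name in df.columns:
    if " " in old_name:
        df = df.withColumnRenamed(old_name, old_name.replace(" ", "_"))

# Write the cleaned data as a managed Delta table in the Lakehouse.
df.write.mode("overwrite").format("delta").saveAsTable("customers")

Either way, the key point is the same: the column names must be free of spaces before the data is written as a Lakehouse table.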

 

I hope my suggestions give you a good idea of how to proceed. If you need any further assistance, feel free to reach out.

If this post helps, please give us Kudos and consider accepting it as a solution to help other members find it more quickly.

 

Thank you. 

Hi @ringovski ,

 

I wanted to follow up on our previous suggestions regarding the issue. We would love to hear back from you to ensure we can assist you further.

If my response has addressed your query, please accept it as a solution and give a ‘Kudos’ so other members can easily find it. Please let us know if there’s anything else we can do to help.

 

Thank you.

Hi @ringovski  ,

I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.


Thank you.

Hi @ringovski ,

 

May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems find the answer faster.

 

Thank you.

suparnababu8
Super User

Hello @ringovski 

 

If your column headers have spaces, the Lakehouse won't allow you to load the CSV file into the destination. You can read this blog: Optimizing CSV Ingestion: Transforming Space-Delim... - Microsoft Fabric Community. It might help you resolve your problem.

 

Thank you!!!

 

Did I answer your question? Mark my post as a solution!

Proud to be a Super User!

Thanks for the reply; however, this is not my issue and doesn't resolve my problem.

Hi @ringovski 

 

You can copy your files this way: On-premises --> Dataflow --> Lakehouse (using pipelines). Because you have spaces in your column headers, the copy will not succeed otherwise. Just give it a try this way.
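
Since the pipeline's ForEach loop copies a whole series of tables, the same header cleanup can also be applied in one pass after the raw files are landed. This is only a sketch under the assumption that each table is staged as CSV under a Files/raw/ folder in the Lakehouse; the table names are made up.

# Hypothetical example: sanitise the headers of every table staged by the
# ForEach loop before registering each one as a Lakehouse Delta table.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Tables copied by the pipeline's ForEach loop (assumed names).
tables = ["customers", "orders", "order details"]

for table in tables:
    # Raw CSV landed by the Copy activity (assumed folder layout).
    df = spark.read.option("header", True).csv(f"Files/raw/{table}")

    # Replace spaces in the column names with underscores.
    for col_name in df.columns:
        df = df.withColumnRenamed(col_name, col_name.replace(" ", "_"))

    # Also sanitise the destination table name.
    df.write.mode("overwrite").format("delta").saveAsTable(table.replace(" ", "_"))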

 

Thank you!!!

 

Did I answer your question? Mark my post as a solution!

Proud to be a Super User!

This is a pipeline; you can see it at the bottom of the image.

Hi @ringovski 

 

I saw that. I'm suggesting you add a Dataflow and a Copy activity to the pipeline and then execute the pipeline. Hopefully it will work.

 

Thank you!!!

 

Did I answer your question? Mark my post as a solution!

Proud to be a Super User!
