Anonymous
Not applicable

Whitespace in source column name rejected by copy activity

I am using a pipeline with a copy activity to copy data from a SQL Server. Since I loop through a list of tables, I get the mapping from a meta table that I prepare beforehand. However, if a source column name contains a space, I get the following error:

 

Failure happened on 'destination' side. ErrorCode=DeltaInvalidCharacterInColumnName,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Column name Location Code contains invalid characters. ",;{}()\n\t=" are not supported.,Source=Microsoft.DataTransfer.ClientLibrary,'

 

However, the failure is certainly NOT happening on the 'destination' side. The schema looks like this (it fails both with and without the enclosing backticks):

 

{
	"type": "TabularTranslator",
	"mappings": [
		{
			"source": { "name": "`timestamp`" },
			"sink": { "name": "timestamp" }
		},
		{
			"source": { "name": "`Location Code`" },
			"sink": { "name": "Location_Code" }
		}
	]
}

 

Since the space is in the name of the source column, I can't change it. So what do I do?
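For context, the translator JSON is assembled from the meta table rows roughly like this (a simplified sketch; the row values and variable names are made up):

```python
import json

# Hypothetical rows from the meta table: (source_column, sink_column)
meta_rows = [
    ("timestamp", "timestamp"),
    ("Location Code", "Location_Code"),
]

# Build the TabularTranslator object the copy activity expects,
# then serialize it to the string that gets passed as a parameter.
translator = {
    "type": "TabularTranslator",
    "mappings": [
        {"source": {"name": src}, "sink": {"name": dst}}
        for src, dst in meta_rows
    ],
}

mapping_string = json.dumps(translator)
print(mapping_string)
```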

1 ACCEPTED SOLUTION
Anonymous
Not applicable

Alright, I got it, and it was trivial in hindsight: the translator above is a literal string. To pass it correctly as a parameter, you have to use "@json(mapping_string)" instead of what I did, which was simply "@mapping_string".
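In other words, the parameter arrives as a plain string, and @json() parses it into an object before the translator property sees it. The distinction is the same as between a string and json.loads(string) in Python (illustrative only; @json is the pipeline expression function, not Python):

```python
import json

mapping_string = (
    '{"type": "TabularTranslator", "mappings": '
    '[{"source": {"name": "Location Code"}, "sink": {"name": "Location_Code"}}]}'
)

# Passing "@mapping_string" hands the activity this raw string...
assert isinstance(mapping_string, str)

# ...while "@json(mapping_string)" hands it the parsed object,
# which is what the translator property actually expects.
mapping_object = json.loads(mapping_string)
assert isinstance(mapping_object, dict)
print(mapping_object["mappings"][0]["source"]["name"])  # Location Code
```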


6 REPLIES
priyankabis
Helper I

Isn't there a way or setting, like in notebooks, to read columns with spaces and save them with _ instead of the space?

tarektm7
New Member

I was facing the same issue in a similar situation. 

The only workaround I could find was to use a notebook, load the data into a DataFrame, and then write it to a Delta table as below:

df.write.option('delta.columnMapping.mode', 'name').save(path='Tables/schemaname/tablename', format='delta', mode='overwrite')
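If you'd rather avoid column mapping mode entirely, the notebook can also rename the columns before writing. A small sketch of the rename step (plain Python on the column names; the sanitize helper is hypothetical):

```python
# Characters Delta rejects per the error message (",;{}()\n\t="),
# plus the space that actually triggered the error above.
INVALID = ' ,;{}()\n\t='

def sanitize(name: str) -> str:
    """Return a Delta-safe column name, assuming underscores are acceptable."""
    for ch in INVALID:
        name = name.replace(ch, '_')
    return name

cols = ['timestamp', 'Location Code']
new_cols = [sanitize(c) for c in cols]
print(new_cols)  # ['timestamp', 'Location_Code']
```

In PySpark you would apply it before the write with df = df.toDF(*[sanitize(c) for c in df.columns]).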
chrichards
Regular Visitor

Hi - were you able to resolve this?  I'm encountering the same issue...

Anonymous
Not applicable

Hi @Anonymous 

 

You can try using square brackets instead of backticks around the source column name. For example:

 

{
    "type": "TabularTranslator",
    "mappings": [
        {
            "source": { "name": "[timestamp]" },
            "sink": { "name": "timestamp" }
        },
        {
            "source": { "name": "[Location Code]" },
            "sink": { "name": "Location_Code" }
        }
    ]
}

 

Regards,

Nono Chen

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

Anonymous
Not applicable

Nice try, but it doesn't work; it yields the same error message.

 

Just in case, the input for the copy activity really looks like this:

{
    "source": {
        "type": "SqlServerSource",
        "queryTimeout": "02:00:00",
        "partitionOption": "None",
        "datasetSettings": {
            "annotations": [],
            "type": "SqlServerTable",
            "schema": [],
            "typeProperties": {
                "schema": "dbo",
                "table": "ABCDEFG",
                "database": "HIJKLMNOP"
            },
            "externalReferences": {
                "connection": "684354684651687654657684"
            }
        }
    },
    "sink": {
        "type": "LakehouseTableSink",
        "tableActionOption": "OverwriteSchema",
        "partitionOption": "None",
        "datasetSettings": {
            "annotations": [],
            "linkedService": {
                "name": "Sandbox",
                "properties": {
                    "annotations": [],
                    "type": "Lakehouse",
                    "typeProperties": {
                        "workspaceId": "XXX",
                        "artifactId": "XXX",
                        "rootFolder": "Tables"
                    }
                }
            },
            "type": "LakehouseTable",
            "schema": [],
            "typeProperties": {
                "table": "XXX"
            }
        }
    },
    "enableStaging": false,
    "translator": "{\n\t\"type\": \"TabularTranslator\",\n\t\"mappings\": [\n\t\t{\n\t\t\t\"source\": { \"name\": \"[timestamp]\" },\n\t\t\t\"sink\": { \"name\": \"timestamp\" }\n\t\t},\n\t\t{\n\t\t\t\"source\": { \"name\": \"[Location Code]\" },\n\t\t\t\"sink\": { \"name\": \"Location_Code\" },  *[here I removed the other columns]* 
 \n\t]\n}"
}
