I am using a pipeline with a copy activity to get data from a SQL Server. Since I loop through a list of tables, I get the mapping from a meta table that I prepare beforehand. However, if a source column name contains a space, I get the following error:
Failure happened on 'destination' side. ErrorCode=DeltaInvalidCharacterInColumnName,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Column name Location Code contains invalid characters. ",;{}()\n\t=" are not supported.,Source=Microsoft.DataTransfer.ClientLibrary,'
However, the failure is certainly NOT happening on the 'destination' side. The schema looks like this (it works neither with the enclosing backticks nor without them):
{
    "type": "TabularTranslator",
    "mappings": [
        {
            "source": { "name": "`timestamp`" },
            "sink": { "name": "timestamp" }
        },
        {
            "source": { "name": "`Location Code`" },
            "sink": { "name": "Location_Code" }
        }
    ]
}
Since the space is in the name of the source column, I can't change it. So what do I do?
Isn't there a way or setting, like we have in notebooks, so that it can read columns with spaces and save them with "_" instead of a space?
I was facing the same issue in a similar situation.
The only workaround I found is to use a notebook, load the data into a dataframe, and then write it into a Delta table, as below:
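(The original code was not preserved in the post; the following is a minimal sketch of that workaround, assuming a Spark notebook in Fabric. The server, database, credentials, and table names are placeholders.)

# Read the source table over JDBC; connection details are placeholders.
df = (spark.read
    .format("jdbc")
    .option("url", "jdbc:sqlserver://<server>;databaseName=<database>")
    .option("dbtable", "dbo.<table>")
    .option("user", "<user>")
    .option("password", "<password>")
    .load())

# Rename columns, replacing spaces with underscores so Delta accepts them.
for col_name in df.columns:
    df = df.withColumnRenamed(col_name, col_name.replace(" ", "_"))

# Overwrite the Lakehouse Delta table.
df.write.format("delta").mode("overwrite").saveAsTable("<target_table>")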
Hi - were you able to resolve this? I'm encountering the same issue...
Hi @Anonymous
You can try using square brackets around the source column name instead of backticks. For example:
{
    "type": "TabularTranslator",
    "mappings": [
        {
            "source": { "name": "[timestamp]" },
            "sink": { "name": "timestamp" }
        },
        {
            "source": { "name": "[Location Code]" },
            "sink": { "name": "Location_Code" }
        }
    ]
}
Regards,
Nono Chen
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Nice try, but it doesn't work, yielding the same error message.
Just in case, the input for the copy activity really looks like this:
{
    "source": {
        "type": "SqlServerSource",
        "queryTimeout": "02:00:00",
        "partitionOption": "None",
        "datasetSettings": {
            "annotations": [],
            "type": "SqlServerTable",
            "schema": [],
            "typeProperties": {
                "schema": "dbo",
                "table": "ABCDEFG",
                "database": "HIJKLMNOP"
            },
            "externalReferences": {
                "connection": "684354684651687654657684"
            }
        }
    },
    "sink": {
        "type": "LakehouseTableSink",
        "tableActionOption": "OverwriteSchema",
        "partitionOption": "None",
        "datasetSettings": {
            "annotations": [],
            "linkedService": {
                "name": "Sandbox",
                "properties": {
                    "annotations": [],
                    "type": "Lakehouse",
                    "typeProperties": {
                        "workspaceId": "XXX",
                        "artifactId": "XXX",
                        "rootFolder": "Tables"
                    }
                }
            },
            "type": "LakehouseTable",
            "schema": [],
            "typeProperties": {
                "table": "XXX"
            }
        }
    },
    "enableStaging": false,
    "translator": "{\n\t\"type\": \"TabularTranslator\",\n\t\"mappings\": [\n\t\t{\n\t\t\t\"source\": { \"name\": \"[timestamp]\" },\n\t\t\t\"sink\": { \"name\": \"timestamp\" }\n\t\t},\n\t\t{\n\t\t\t\"source\": { \"name\": \"[Location Code]\" },\n\t\t\t\"sink\": { \"name\": \"Location_Code\" }, *[here I removed the other columns]*\n\t]\n}"
}
Alright, I got it, and it was trivial in hindsight: the translator above is a literal string. To pass it correctly as a parameter, you have to use "@json(mapping_string)" instead of what I did, which was simply "@mapping_string".
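For illustration, a hedged sketch of what the dynamic content for the translator property could then look like, assuming the mapping JSON is held in a pipeline variable named mapping_string (the variable name is just an example):

"translator": {
    "value": "@json(variables('mapping_string'))",
    "type": "Expression"
}

Wrapping the string in @json() makes the pipeline parse it into an object, so the copy activity receives a real TabularTranslator instead of an escaped string.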