DataSkills
Resolver I

Dataflow fails with error "contains invalid characters. ",;{}()\n\t=" are not supported"

Hello, I have a dataflow that is causing errors. 

 

The full error message is:


JT_TASK_WriteToDataDestination: There was a problem refreshing the dataflow: 'Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: DataSource.Error: Pipeline execution failed (runId: xxxxx-e36c-47ae-9377-9f5669aa9514). Operation on target ca-xxxxxx-0c53-4098-a285-519620e6cc83 failed: Failure happened on 'destination' side. ErrorCode=DeltaInvalidCharacterInColumnName,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Column name Work Task No contains invalid characters. ",;{}()\n\t=" are not supported.,Source=Microsoft.DataTransfer.ClientLibrary,' Details: Reason = DataSource.Error;RunId = xxxxx-e36c-47ae-9377-9f5669aa9514'. Error code: Fast Copy User Error. (Request ID: xxxxxxx-06c1-4bd6-9513-87bee9df9bf4).

 

The code where the column names are modified is as follows:

Table.RenameColumns(
    #"Removed other columns",
    {
        {"TASK_SEQ", "Work Task No"},
        {"WO_NO", "WO No"},
        {"SITE", "WT Site"},
        {"ORGANIZATION_SITE", "Maint Org Site"},
        {"ORGANIZATION_ID", "Maint Org"},
        {"WORK_TYPE_ID", "WT Work Type ID"},
        {"DESCRIPTION", "Description"},
        {"LONG_DESCRIPTION", "Long Description"},
        {"CREATED_BY", "Task Created By"},
        {"CREATED_DATE", "WT Created Date"},
        {"PREPARED_BY", "Prepared By"},
        {"REPORTED_BY", "Reported By"},
        {"REPORTED_DATE", "Work Task Reported Date"},
        {"ACTUAL_START", "WT Start Date"},
        {"ACTUAL_FINISH", "WT Finish Date"},
        {"SLA_LATEST_FINISH", "SLA Date"},
        {"EXCLUDE_FROM_SCHEDULING", "Excluded from Scheduling"},
        {"EXCLUDE_FROM_SCHEDULING_DB", "Exclude from Scheduling"},
        {"ADJUSTED_DURATION", "Adjusted Duration"},
        {"REMARK", "Remark"},
        {"INTERNAL_REMARK", "Internal Remark"},
        {"ACTION_TAKEN", "Action Taken"},
        {"CANCEL_CAUSE", "Cancel Cause"},
        {"ERROR_CAUSE_LONG", "Error Cause Long"},
        {"ERROR_TYPE", "Error Type"},
        {"ERROR_CLASS", "Error Class"},
        {"ERROR_DISCOVER_CODE", "Error Discovery Code"},
        {"ERROR_SYMPTOM", "Error Symptom"},
        {"ITEM_CLASS_ID", "Item Class ID"},
        {"ERROR_CAUSE", "Error Cause"},
        {"FAILING_COMPONENT", "Failing Component"},
        {"PERFORMED_ACTION_ID", "Performed Action ID"},
        {"PERFORMED_WORK", "Performed Work"},
        {"VENDOR_NO", "Supplier No"},
        {"CONTACT", "Contact"},
        {"CONTACT_PHONE_NO", "Contact Phone No"},
        {"E_MAIL", "Email"},
        {"SOURCE_REF1", "Source Reference"},
        {"CHANGED_DATE", "Changed Date"},
        {"C_ORIGIN_WORK_TASK", "Original Work Task"},
        {"C_COPIED_REVISIT_TASK", "Copied Revisit Task"},
        {"C_FUNCTIONAL_SITE", "Functional Site"},
        {"STATE", "WT Status"},
        {"OBJKEY", "Task OBJKEY"}
    }
)
 
The error message appears to point at the very first renamed column, "Work Task No", but I can't see any invalid characters in it. I use this identical name elsewhere in my lakehouse without any problem. This is really frustrating me. Can anyone suggest what I have overlooked?
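For reference, here is a quick diagnostic I put together to rule out hidden characters in the names. This is only a sketch: the step name #"Renamed Columns" is a placeholder for whatever step applies Table.RenameColumns, and the character list is taken from the error message (plus the non-breaking space, a common invisible culprit when names are pasted from elsewhere).

```
let
    // Placeholder: substitute the actual step that applies Table.RenameColumns
    Source = #"Renamed Columns",
    // Characters the Delta destination rejects, per the error message:
    // " , ; { } ( ) newline tab =  -- plus non-breaking space as a likely hidden culprit
    InvalidChars = Text.ToList(",;{}()=") & {"""", "#(lf)", "#(tab)", "#(00A0)"},
    // Keep only the column names that contain at least one rejected character
    BadNames = List.Select(
        Table.ColumnNames(Source),
        each List.MatchesAny(Text.ToList(_), (c) => List.Contains(InvalidChars, c))
    )
in
    BadNames
```

If this query returns an empty list, the names themselves are clean and the problem lies elsewhere (for example, stale metadata on the destination).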
 
Thanks in advance. 
1 ACCEPTED SOLUTION
DataSkills
Resolver I

In follow-up to my previous message, I have managed to overcome this frustrating issue.

 

I want to mention that I had already tried dropping the Lakehouse destination table, which didn't resolve the issue. I had also attempted to delete and recreate the destination on the dataflow query, without success. 

 

I then went the route of recreating the query from scratch, literally duplicating the steps one by one, and THEN I set up the data destination. Once I had a duplicate query, I deleted the original query (in the dataflow) and republished and it finally worked. 

 

These kinds of issues are frustrating, as they reveal Fabric's soft underbelly! But a lesson to me (and possibly to others) is, when in doubt, try starting from a clean slate. 
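If recreating the query from scratch isn't practical, another option (not what I did in the end, just a sketch) is to defensively strip the rejected characters from every column name immediately before the destination step. The step name #"Renamed Columns" is a placeholder for your own rename step:

```
let
    // Placeholder: substitute the actual step that applies Table.RenameColumns
    Source = #"Renamed Columns",
    // Characters the Delta destination rejects, per the error message
    Rejected = Text.ToList(",;{}()=") & {"""", "#(lf)", "#(tab)"},
    // Build {oldName, cleanedName} pairs; names without rejected characters are unchanged
    NewNames = List.Transform(
        Table.ColumnNames(Source),
        each {_, Text.Remove(_, Rejected)}
    )
in
    Table.RenameColumns(Source, NewNames)
```

Renaming a column to its existing name is a no-op, so this only alters the names that actually contain a rejected character.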


