Hello, I have a dataflow that is causing errors.
The full error message is:
JT_TASK_WriteToDataDestination: There was a problem refreshing the dataflow: 'Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: DataSource.Error: Pipeline execution failed (runId: xxxxx-e36c-47ae-9377-9f5669aa9514). Operation on target ca-xxxxxx-0c53-4098-a285-519620e6cc83 failed: Failure happened on 'destination' side. ErrorCode=DeltaInvalidCharacterInColumnName,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Column name Work Task No contains invalid characters. ",;{}()\n\t=" are not supported.,Source=Microsoft.DataTransfer.ClientLibrary,' Details: Reason = DataSource.Error;RunId = xxxxx-e36c-47ae-9377-9f5669aa9514'. Error code: Fast Copy User Error. (Request ID: xxxxxxx-06c1-4bd6-9513-87bee9df9bf4).
The code where the column names are modified is as follows:
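As a hedged illustration (not the poster's actual code), here is a minimal Power Query M sketch of the kind of rename that avoids this error: it rebuilds each column name, replacing the characters the Delta destination rejects with underscores. The sample #table, the step names, and the underscore replacement are all assumptions for the example:

```
let
    // Hypothetical stand-in for the real source query
    Source = #table(
        {"Work Task No", "Description"},
        {{"WT-001", "First task"}, {"WT-002", "Second task"}}
    ),
    // Characters a Delta table column name may not contain, per the
    // error message (the set " ,;{}()\n\t=" includes the space character,
    // which is likely what "Work Task No" tripped over)
    InvalidChars = {" ", ",", ";", "{", "}", "(", ")", "#(lf)", "#(tab)", "="},
    // Rebuild each column name, swapping rejected characters for underscores
    Renamed = Table.TransformColumnNames(
        Source,
        (name) => Text.Combine(
            List.Transform(
                Text.ToList(name),
                (c) => if List.Contains(InvalidChars, c) then "_" else c
            )
        )
    )
in
    Renamed
```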
Following up on my previous message, I have managed to overcome this frustrating issue.
I want to mention that I had already tried dropping the Lakehouse destination table, which didn't resolve the issue. I had also tried deleting and recreating the destination on the dataflow query, without success.
I then went the route of recreating the query from scratch, literally duplicating the steps one by one, and THEN I set up the data destination. Once I had a duplicate query, I deleted the original query (in the dataflow), republished, and it finally worked.
These kinds of issues are frustrating, as they reveal Fabric's soft underbelly! But a lesson to me (and possibly to others): when in doubt, try starting from a clean slate.
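For anyone hitting the same wall, a hedged diagnostic that may save a full rebuild: before republishing, list any column names that contain a rejected character, since hidden tabs or stray spaces can survive a copy-paste unnoticed. Everything here (the sample #table with an embedded tab, the step names) is assumed for illustration:

```
let
    // Hypothetical table whose column name hides a trailing tab
    Source = #table({"Work Task No#(tab)"}, {{"WT-001"}}),
    // Character set the Delta destination rejects (" ,;{}()\n\t=")
    InvalidChars = {" ", ",", ";", "{", "}", "(", ")", "#(lf)", "#(tab)", "="},
    // Column names containing at least one rejected character
    Offending = List.Select(
        Table.ColumnNames(Source),
        (name) => List.AnyTrue(
            List.Transform(Text.ToList(name), (c) => List.Contains(InvalidChars, c))
        )
    )
in
    // Returns {"Work Task No#(tab)"} for this sample input
    Offending
```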