March 31 - April 2, 2025, in Las Vegas, Nevada. Use code MSCUST for a $150 discount! Early bird discount ends December 31.
I am experiencing an issue with copying two tables from one lakehouse (with schema) to another lakehouse. The copy operation fails, but it succeeds when copying without the schema (using the same table name).
Error Message:
There was a problem refreshing the dataflow: 'Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: DataSource.Error: Pipeline execution failed (runId: a3eba383-b856-4821-9ec6-95d2c6c4bdaa). Operation on target ca-9a694c62-cba0-497c-8416-ffe320aea73d failed: Lakehouse table name should only contain letters, numbers, and underscores. The name must also be no more than 256 characters long. Details: Reason = DataSource.Error;RunId = a3eba383-b856-4821-9ec6-95d2c6c4bdaa'. Error code: Fast Copy User Error. (Request ID: e869082f-206a-4ce5-9bc1-ec5cd5a94b83).
One of the table names is silver.Processsed_4ly_v.
I have attempted to rename the table, but the issue persists.
Is it possible to set the destination lakehouse to have the same schema as the source? I have seen this issue when using the copy activity in a data pipeline between two lakehouses which do not have the same schema (or when one has a schema and the other does not). I understand that this is not exactly the same, since I am referring to data pipelines and your issue is with Gen2 dataflows, but hopefully this gives you another angle to try.
Hi @Billybilly
Here I have some suggestions:
Ensure that the table name silver.Processsed_4ly_v follows the naming rules:
Only letters, numbers, and underscores are allowed.
The name should not exceed 256 characters.
Avoid special characters or spaces.
If you haven't already, try renaming the table to something simpler, like silver.Processed4ly. This can help rule out any issues related to specific characters or naming conventions.
Try copying a very simple table to see if the issue is related to the complexity of the original table.
Check the error log for more detailed information.
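As a quick sanity check before renaming anything, the naming rule quoted in the error message can be verified mechanically. This is a minimal sketch: the regex and the splitting of schema-qualified names on the dot are my reading of the error's wording, not Fabric's actual validation logic.

```python
import re

def is_valid_lakehouse_table_name(name: str) -> bool:
    """Check a (possibly schema-qualified) table name against the rule
    from the error message: only letters, numbers, and underscores,
    and no more than 256 characters per part."""
    parts = name.split(".")
    return all(
        len(p) <= 256 and re.fullmatch(r"[A-Za-z0-9_]+", p) is not None
        for p in parts
    )

print(is_valid_lakehouse_table_name("silver.Processsed_4ly_v"))  # True
print(is_valid_lakehouse_table_name("sales data"))               # False
```

If a name passes this check but the copy still fails, that points away from the naming rule and toward something else in the schema handling, which matches what you are seeing.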
Regards,
Nono Chen
If this post helps, then please consider Accept it as the solution to help the other members find it more quickly.
Thank you for your response.
The table name matches the requirement shown in the error message.
I want to clarify that the data copy succeeds when the source is from the lakehouse without a schema. This makes me wonder if there might be a bug in Dataflow Gen 2 or the lakehouse configuration.
I also tried using a simple table name, but the issue persists.
Hi @Billybilly
In response to your case, please check the following:
The column data types of the source and target tables must match.
The number and names of columns in the source and target tables must be the same. If the source table has additional or missing columns, the copy may fail. Ensure that the table structure matches exactly in both lakehouses.
Check indexes and constraints (such as primary keys, foreign keys, and unique constraints) on the source and target tables. These constraints should also exist in the target table to ensure data integrity and query performance.
You can test with some simple data.
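The structural checks above can be automated with a small diff of the two column lists. A sketch with made-up columns; in practice you could pull each table's columns from INFORMATION_SCHEMA.COLUMNS via the lakehouse's SQL analytics endpoint.

```python
# Hypothetical column listings for the source and target tables
# (name -> data type); replace with real metadata from each lakehouse.
source_cols = {"customer_id": "int", "sales_amount": "decimal(18,2)", "region": "varchar(50)"}
target_cols = {"customer_id": "int", "region": "varchar(50)"}

# Columns present on one side only, and shared columns whose types differ.
missing_in_target = set(source_cols) - set(target_cols)
extra_in_target = set(target_cols) - set(source_cols)
type_mismatches = {
    c for c in set(source_cols) & set(target_cols)
    if source_cols[c] != target_cols[c]
}

print(sorted(missing_in_target))  # ['sales_amount']
print(sorted(extra_in_target))    # []
print(sorted(type_mismatches))    # []
```

An empty result for all three sets means the structures match, which would rule this cause out.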
Regards,
Nono Chen
Hi @NONO Chen,
Thank you for the response and the detailed suggestions.
To clarify further, no constraints are applied in this scenario, and there have been no changes to the data types, column names, or the number of columns between the source and target tables. Both tables are identical in structure.
Let me reiterate the key observation: the copy succeeds when the source lakehouse has no schema and fails only when a schema is involved.
Based on this, I suspect the issue may not be related to column data types, names, or constraints.
Looking forward to your insights!
Best regards,
Billybilly
Hi,
Unfortunately, I don't have a solution to this problem, but I would just like to add that I'm having the exact same issue.
My table names also don't contain any special characters.
Example of table name is: companynamesalesdata
I was able to fix my issue. In the source step, I had the connection pointed at the lakehouse directly, which didn't work. I had to make a SQL connection to the lakehouse's SQL endpoint instead, and there my query worked without issues.
I hope this can help you too.
Hi Anne-MarijnV,
Thank you for providing the solution. I tried extracting data via the SQL endpoint, but the Dataflow Gen2 refresh keeps spinning for more than an hour. Normally, this process only takes a few minutes.
I wonder if you're experiencing the same issue on your end.
Best regards,
Billybilly