Hi,
I've created what is probably the most basic dataflow there is: it connects to Dataverse, grabs the contact table, and creates a copy in a Lakehouse (a sketch of the query follows the error below). The problem is that it fails with the following error:
contact_WriteToDataDestination: Mashup Exception Data Source Error Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: DataSource.Error: Error in replacing table's content with new data in a version: #{0}., InnerException: #{0}: #{1}, Underlying error: Microsoft SQL: An error occurred while sending the request.
RequestId: TDS;5a9a7b5e-7d06-472d-8d9f-89c2bd11f625;7
Time: 2024-11-07T16:18:45.4380774Z Details: Reason = DataSource.Error;Microsoft.Data.Mashup.ErrorCode = Lakehouse036;Message = Microsoft SQL: An error occurred while sending the request.
RequestId: TDS;5a9a7b5e-7d06-472d-8d9f-89c2bd11f625;7
Time: 2024-11-07T16:18:45.4380774Z;Detail = [DataSourceKind = "CommonDataService", DataSourcePath = "operations-xxx.crm11.dynamics.com", Message = "An error occurred while sending the request.#(lf)RequestId: TDS;5a9a7b5e-7d06-472d-8d9f-89c2bd11f625;7#(lf)Time: 2024-11-07T16:18:45.4380774Z", ErrorCode = -2146232060, Number = 40000, Class = 16, State = 1];Message.Format = #{0}: #{1};Message.Parameters = {"Microsoft SQL", "An error occurred while sending the request.#(lf)RequestId: TDS;5a9a7b5e-7d06-472d-8d9f-89c2bd11f625;7#(lf)Time: 2024-11-07T16:18:45.4380774Z"};ErrorCode = 10478;Microsoft.Data.Mashup.Error.Context = User
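For context, the whole query is little more than the connector step and a navigation step, along these lines (a rough sketch from memory; the exact navigation step may differ in your environment):

let
    // Dataverse (TDS endpoint) connector, matching the DataSourcePath in the error
    Source = CommonDataService.Database("operations-xxx.crm11.dynamics.com"),
    // Navigate to the contact table
    contact = Source{[Schema = "dbo", Item = "contact"]}[Data]
in
    contact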
I have created exactly the same dataflow for a different table (account), which worked with no issue. Does anyone have any idea what the problem is?
Thanks!
Hi,
Can you try deleting the Lakehouse from the destination and adding it one more time (if you created the dataflow from your Lakehouse, that Lakehouse is taken as the destination by default)?
Or you can edit the Lakehouse connection while adding the destination (if you are adding the Lakehouse as the destination yourself).
Try these two and let me know. Also, I hope you have the System Administrator role for the Dataverse environment.
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.
Hi @eyeballkid,
Thank you very much, FabianSchut, for your prompt reply.
Regarding Mashup errors, I've seen several people on the forums suggest, in various cases, that highlighting all columns and selecting Keep Rows > Keep Errors solves the problem.
This way, you can effectively identify the errors in your data. You can try this method; it might help you.
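In M, that step corresponds to something like the following (a sketch; your connector and step names will differ):

let
    Source = CommonDataService.Database("operations-xxx.crm11.dynamics.com"),
    contact = Source{[Schema = "dbo", Item = "contact"]}[Data],
    // Keep only the rows that contain an error in any column
    ErrorRows = Table.SelectRowsWithErrors(contact, Table.ColumnNames(contact))
in
    ErrorRows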
Solved: Mashup Exception Expression Error DataFlows 2 Gen - Microsoft Fabric Community
Solved: Dataflows Gen2 – Import – Mashup - DataFormat Erro... - Microsoft Fabric Community
Regards,
Nono Chen
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.
Hi @Anonymous,
Thanks for the information. I have tried the Keep Rows > Keep Errors option, but trying this on the entire table seemed to time out, so I ran it in sections and it generated no errors.
If I take a subset of columns (contactid, firstname, lastname) and run the dataflow, it runs to completion, but if I run the entire table, it fails. I'm at a bit of a loss as to where to go!
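For reference, the working subset is just a Table.SelectColumns step like this (step names are illustrative):

let
    Source = CommonDataService.Database("operations-xxx.crm11.dynamics.com"),
    contact = Source{[Schema = "dbo", Item = "contact"]}[Data],
    // With only these three columns selected, the dataflow completes
    Subset = Table.SelectColumns(contact, {"contactid", "firstname", "lastname"})
in
    Subset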
Thanks!
Hi @eyeballkid
Thank you for your feedback.
I have some suggestions:
You mentioned that the dataflow succeeded when processing a subset of columns, so consider checking the remaining columns (see the sketch after these suggestions). If some columns contain a large amount of data, this may cause a timeout.
If there are any transformations or calculations in the dataflow, simplify or temporarily remove them to see if that solves the problem.
Or try increasing the timeout limit.
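One way to narrow it down is to run the dataflow on half of the columns at a time and keep splitting the half that fails (a sketch; the navigation steps are assumptions, and you can swap List.FirstN for List.LastN to test the other half):

let
    Source = CommonDataService.Database("operations-xxx.crm11.dynamics.com"),
    contact = Source{[Schema = "dbo", Item = "contact"]}[Data],
    AllColumns = Table.ColumnNames(contact),
    // Take the first half of the column names for this test run
    Half = Number.RoundDown(List.Count(AllColumns) / 2),
    FirstHalf = Table.SelectColumns(contact, List.FirstN(AllColumns, Half))
in
    FirstHalf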
I hope this has been of some help to you.
Regards,
Nono Chen
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.
Hi, is the schema of the source and the destination table the same? You may delete the destination table once (if it already exists) to check whether that is the case.
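If it helps to compare them side by side, Table.Schema lists the column names and types of the source, which you can check against the Lakehouse table (a sketch; the navigation steps are assumptions):

let
    Source = CommonDataService.Database("operations-xxx.crm11.dynamics.com"),
    contact = Source{[Schema = "dbo", Item = "contact"]}[Data],
    // Returns one row per column, with Name, TypeName, IsNullable, etc.
    SourceSchema = Table.Schema(contact)
in
    SourceSchema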
Hi FabianSchut, many thanks for the reply.
I have tried deleting the table and re-running the job, but it still fails. It even fails on the very first run, when there is no table at all and it has to create one from the source schema. It's very frustrating!