The first Dataflow Gen2 refresh to the Lakehouse destination runs fine.
But from the second refresh onward, I'm getting this error:
WriteToDataDestination: There was a problem refreshing the dataflow: "Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: Error in comitting version., InnerException: We can't update the value. The operation or its arguments aren't supported by the value., Underlying error: We can't update the value. The operation or its arguments aren't supported by the value. Details: Reason = DataSource.Error;Message = We can't update the value. The operation or its arguments aren't supported by the value.;Detail = #table({"Version", "Published", "Data", "Modified"}, {});Message.Format = We can't update the value. The operation or its arguments aren't supported by the value. GatewayObjectId: 0669ef20-1e0a-4fce-8b46-8dfd31d7b8f3". Error code: 999999. (Request ID: a4a13727-406c-4933-ae3d-a73d47975096).
I'm using a Lakehouse destination with Replace and Dynamic schema.
Why does this error appear only from the second refresh, given that it's a Replace table?
Hi @EdMk,
This error is coming from the gateway, not the Lakehouse or Dynamic schema itself.
The pattern “Error in committing version … We can’t update the value … Detail = #table({"Version","Published","Data","Modified"}, {}) … Error code: 999999” is a known issue with older on-premises data gateway versions when a Dataflow Gen2 writes to a Lakehouse and does a Replace on later refreshes.
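For reference, the Detail value #table({"Version", "Published", "Data", "Modified"}, {}) is Power Query M syntax for an empty table with those four columns, so the gateway is effectively reporting that no committed version rows came back; that points at the gateway itself rather than your query or the Lakehouse.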
Ask your admin to update the on-premises data gateway to the latest version (Manage gateways → select the gateway → if “Update” appears, run it, or download the latest build and install it).
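If you want to confirm the installed gateway version without signing in to the gateway machine, a minimal sketch against the Power BI REST API could look like this (assuming Python with the requests library and an Azure AD access token you have already acquired; the exact layout of gatewayAnnotation, where the service reports gatewayVersion, is an assumption):

import json
import requests

TOKEN = "<aad-access-token>"  # hypothetical placeholder; acquire via MSAL or your usual flow

# List the gateways visible to the signed-in user
resp = requests.get(
    "https://api.powerbi.com/v1.0/myorg/gateways",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()

for gw in resp.json().get("value", []):
    # gatewayAnnotation is a JSON-encoded string; the installed version
    # is reported inside it as gatewayVersion (layout is an assumption)
    annotation = json.loads(gw.get("gatewayAnnotation") or "{}")
    print(gw["name"], "->", annotation.get("gatewayVersion", "unknown"))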
After the gateway upgrade, re-publish the dataflow and run the refresh again.
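If you prefer to trigger the refresh programmatically rather than from the portal, a sketch using the classic Power BI dataflow refresh endpoint could look like the following (same token assumption as above; whether this endpoint applies to your specific Dataflow Gen2 item is also an assumption, so the portal's manual refresh remains the guaranteed path):

import requests

TOKEN = "<aad-access-token>"       # hypothetical placeholder
WORKSPACE_ID = "<workspace-guid>"  # hypothetical: your workspace (group) id
DATAFLOW_ID = "<dataflow-guid>"    # hypothetical: the dataflow object id

# Request an on-demand refresh of the dataflow
url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/dataflows/{DATAFLOW_ID}/refreshes"
)
resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"notifyOption": "MailOnFailure"},
)
resp.raise_for_status()
print("Refresh requested, HTTP", resp.status_code)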
If it still fails, temporarily switch the destination mapping from Dynamic schema to a fixed schema to confirm the gateway is the only issue; if the error persists, log a Fabric support ticket with the Request ID from the latest failed refresh.
You can also follow this link for reference.
If this helps, please give it a kudos 👍 and mark it as Accepted Solution ✅ so others hitting the same Dataflow Gen2 Mashup error can find it easily.
Thanks & Regards
Shashi Paul
Thank you!
Yes, after upgrading the gateway version, the Dataflow Gen2 is working!
Bingo!
Glad to know it helped you resolve your issue.
Shashi P