I am very new to Fabric. I have a dataflow set up, with a semantic model and report, which all work; however, I am getting the following error on a Dataflow Gen2 scheduled refresh. The destination for the data is a Lakehouse. My only fix at the minute is to delete the table out of the Lakehouse, after which it is recreated automatically, but subsequent refreshes fail again.
Private data below has been replaced with xxxxxxxxxxxxxxxx.
Any help appreciated. I have tried deleting the tables and recreating them, but every refresh fails.
xxxxxx_WriteToDataDestination: There was a problem refreshing the dataflow: "Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: Error in replacing table's content with new data in a version: #{0}., InnerException: #{0} failed to get contents from '#{1}'. Status code: #{2}, description: '#{3}'., Underlying error: AzureDataLakeStorage failed to get contents from xxxxxxxxxxxx. Status code: 409, description: 'The specified path already exists.'. Details: Reason = DataSource.Error;ErrorCode = Lakehouse036;Message = AzureDataLakeStorage failed to get contents from 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'. Status code: 409, description: 'The specified path already exists.'.;Detail = [DataSourceKind = "Lakehouse", DataSourcePath = "Lakehouse", DataSourceKind.2 = "AzureDataLakeStorage", DataSourcePath.2 = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxx", HttpStatusCode = 409, HttpStatusDescription = "The specified path already exists."];Message.Format = #{0} failed to get contents from '#{1}'. Status code: #{2}, description: '#{3}'.;Message.Parameters = {"AzureDataLakeStorage", "xxxxxxxxxxxxxxxxxxxxxxxxxx", 409, "The specified path already exists."};ErrorCode = 10266;Microsoft.Data.Mashup.Error.Context = User GatewayObjectId: b01b6364-d16a-4eaf-80d5-8e7a3c7b7348". Error code: 104100. (Request ID: 5c8f0582-c49c-46a0-9476-e05bec896ac1).
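For reference, the manual workaround (deleting the table so the next refresh recreates it) can also be done from a Fabric notebook instead of the Lakehouse UI. This is only a sketch, assuming the Lakehouse is attached to the notebook as the default lakehouse; my_table is a placeholder for the real destination table:

```python
# Fabric notebook cell; the built-in SparkSession ("spark") is already available.
# Drop the conflicting destination table so the next dataflow refresh can recreate it.
# "my_table" is a placeholder for the table the dataflow writes to.
spark.sql("DROP TABLE IF EXISTS my_table")

# Optional sanity check: confirm the table is gone before re-running the refresh.
spark.sql("SHOW TABLES").show()
```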
Hi @malcolmph ,
We are investigating the issue and will get back to you once we have more details.
In the meantime, if your dataflow does not access on-premises data sources, you could try not using the gateway and see whether you are able to write to the Lakehouse.
Regards,
Alessandro
Hi @malcolmph,
We identified the root cause and it should now be fixed. Can you please try refreshing the dataflow again and let us know whether the issue persists?
Thanks,
Alessandro
Hello @malcolmph ,
Do you have "Replace" selected as the "Update method" on the data destination?
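For context, "Replace" rewrites the whole destination table on every refresh, while "Append" only adds new rows, so a leftover table path is more likely to collide with a Replace write. Conceptually it is similar to the difference between an overwrite and an append in a notebook; this is only a rough sketch with a placeholder table name, not the dataflow's actual implementation:

```python
# Fabric notebook cell; "spark" is the built-in SparkSession and "my_table" is a placeholder.
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])

# "Replace" is conceptually an overwrite: the table is rewritten on every refresh,
# which would line up with a 409 "path already exists" conflict if cleanup fails.
df.write.mode("overwrite").format("delta").saveAsTable("my_table")

# "Append" only adds the new rows to the existing table.
df.write.mode("append").format("delta").saveAsTable("my_table")
```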
Have a nice day,
Vivien
Yes, I have Replace selected.