Hi,
I am working on a process to take Microsoft Forms data and put it into a database. To do this, I am doing the following steps:
1: A Power Automate flow that runs when a response is submitted to the form and saves the response to an Excel file on SPO.
2: A Dataflow Gen2 that loads the data from the Excel file into a lakehouse.
For step 2, when I first created the dataflow and ran it, it worked perfectly. However, on every subsequent run it has failed.
Here is the error message I am getting:
Query (2)_WriteToDataDestination: There was a problem refreshing the dataflow: 'Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: Error in replacing table's content with new data in a version: #{0}., InnerException: #{0} failed to get contents from '#{1}'. Status code: #{2}, description: '#{3}'., Underlying error: AzureDataLakeStorage failed to get contents from 'https://onelake.dfs.fabric.microsoft.com/XXX/XXX/Tables/dbo/ContractedOccupancy/_delta_log/_mashup_temporary/_mashup_temporary_6044a1d6-e8e3-40bc-8cc1-a8fc45b434a7.version'. Status code: 409, description: 'The specified path already exists.'. Details: Reason = DataSource.Error;ErrorCode = Lakehouse036;Message = AzureDataLakeStorage failed to get contents from 'https://onelake.dfs.fabric.microsoft.com/XXX/XXX/Tables/dbo/ContractedOccupancy/_delta_log/_mashup_temporary/_mashup_temporary_6044a1d6-e8e3-40bc-8cc1-a8fc45b434a7.version'. Status code: 409, description: 'The specified path already exists.'.;Detail = [DataSourceKind = "Lakehouse", DataSourcePath = "Lakehouse", DataSourceKind.2 = "AzureDataLakeStorage", DataSourcePath.2 = "https://onelake.dfs.fabric.microsoft.com/XXX/XXX/Tables/dbo/ContractedOccupancy/_delta_log/_mashup_temporary/_mashup_temporary_6044a1d6-e8e3-40bc-8cc1-a8fc45b434a7.version", Url = "https://onelake.dfs.fabric.microsoft.com/XXX/4XXX/Tables/dbo/ContractedOccupancy/_delta_log/_mashup_temporary/_mashup_temporary_6044a1d6-e8e3-40bc-8cc1-a8fc45b434a7.version"];Message.Format = #{0} failed to get contents from '#{1}'. Status code: #{2}, description: '#{3}'.;Message.Parameters = {"AzureDataLakeStorage", "https://onelake.dfs.fabric.microsoft.com/XXX/XXX/Tables/dbo/ContractedOccupancy/_delta_log/_mashup_temporary/_mashup_temporary_6044a1d6-e8e3-40bc-8cc1-a8fc45b434a7.version", 409, "The specified path already exists."};ErrorCode = 10266;Microsoft.Data.Mashup.Error.Context = User GatewayObjectId: XXX'. Error code: 104100. (Request ID: XXX).
The error complains that a Lakehouse file already exists, but I'm not sure how to fix this. The dataflow's destination is configured to replace the data in the table, so it should be overwriting it. Does anyone know what I'm missing here?
Thanks!
Hi @tayloramy
Make sure the data types are correct when you send the Excel file, because the types often come through wrong. Also check that the column data types in the Dataflow Gen2 are correct, and that the table in the lakehouse is open to schema evolution. (The reason is that Parquet and Excel have different type systems: the Delta/Parquet table stores typed columns such as STRING in the _delta_log, while an Excel cell can be string, number, or text format.)
Instead, use Python - life becomes much simpler: Python --> Data Lake (Apache Spark) --> Delta Lake (Lakehouse).
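For what it's worth, a minimal sketch of that path in a Fabric notebook might look like the following. The file path, table name, and column names (SubmittedAt, Occupancy) are assumptions for illustration, and reading .xlsx with pandas requires the openpyxl engine:

```python
import pandas as pd

# Read the Excel file (assumed to have landed in the Lakehouse Files area;
# pandas uses the openpyxl engine for .xlsx files).
pdf = pd.read_excel("/lakehouse/default/Files/forms/responses.xlsx")

# Cast columns explicitly so Excel's loose cell typing doesn't drift the Delta schema.
# These column names are hypothetical examples.
pdf["SubmittedAt"] = pd.to_datetime(pdf["SubmittedAt"])
pdf["Occupancy"] = pdf["Occupancy"].astype("int64")

# Convert to a Spark DataFrame and overwrite the Delta table in the Lakehouse.
# `spark` is the session predefined in Fabric notebooks.
df = spark.createDataFrame(pdf)
(df.write
   .format("delta")
   .mode("overwrite")
   .option("overwriteSchema", "true")
   .saveAsTable("ContractedOccupancy"))
```

The explicit casts are what address the type-mismatch point above: the Delta table keeps one stable schema no matter how Excel formats a given cell.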
Hi @BhaveshPatel,
I would prefer to use a notebook; however, that makes auth with SPO challenging, as I'm not allowed to have a service principal for this site - I need to use OAuth.
Do you know of a nice way to handle accessing SPO from a notebook with a user's credentials? Note that my org also enforces MFA for access from outside our network.
Thanks,
Taylor
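One pattern that can work for delegated OAuth with MFA from a notebook is MSAL's device code flow: the user signs in and completes MFA once in a browser, and the notebook then calls Microsoft Graph with the resulting delegated token. A rough sketch, where the tenant ID, client ID, site ID, and file path are all placeholders and the app registration must allow public client flows:

```python
import msal
import requests

# Placeholders - substitute your own tenant, app registration, and file location.
TENANT_ID = "<tenant-id>"
CLIENT_ID = "<public-client-app-id>"
SCOPES = ["Files.Read.All"]  # delegated Microsoft Graph scope

app = msal.PublicClientApplication(
    CLIENT_ID, authority=f"https://login.microsoftonline.com/{TENANT_ID}"
)

# Device code flow: prints a URL and a code; the user signs in (and completes MFA)
# in any browser, then the call below returns the delegated token.
flow = app.initiate_device_flow(scopes=SCOPES)
print(flow["message"])
result = app.acquire_token_by_device_flow(flow)

# Download the Excel file from SharePoint via Microsoft Graph.
headers = {"Authorization": f"Bearer {result['access_token']}"}
url = ("https://graph.microsoft.com/v1.0/sites/<site-id>"
       "/drive/root:/forms/responses.xlsx:/content")
resp = requests.get(url, headers=headers)
resp.raise_for_status()
with open("/lakehouse/default/Files/forms/responses.xlsx", "wb") as f:
    f.write(resp.content)
```

The catch, as discussed below, is that device code flow still needs a person in the loop the first time; it only becomes unattended if the token cache is persisted and refreshed silently.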
Hi @tayloramy
I just wanted to check in - have you been able to make any progress with accessing SPO from a notebook using OAuth and MFA?
Hi,
I have not; my org enforces MFA when signing into SPO, so I can't have something run unattended. This is why I was using Dataflow Gen2 - that connection caches the token and asks for MFA on initial setup, but then works automatically from then on.
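For what it's worth, MSAL can reproduce roughly that same caching behavior in a notebook: persist the token cache after one interactive MFA sign-in, then refresh silently on later runs. A hedged sketch - the cache path and scope are assumptions, and whether silent refresh keeps working long-term depends on your org's token-lifetime and Conditional Access policies:

```python
import msal

CACHE_PATH = "/lakehouse/default/Files/auth/msal_cache.json"  # assumed location

# Load the serialized token cache if a previous run left one behind.
cache = msal.SerializableTokenCache()
try:
    with open(CACHE_PATH) as f:
        cache.deserialize(f.read())
except FileNotFoundError:
    pass

app = msal.PublicClientApplication(
    "<public-client-app-id>",  # placeholder
    authority="https://login.microsoftonline.com/<tenant-id>",  # placeholder
    token_cache=cache,
)

# Try a silent acquisition first: it redeems the cached refresh token, no MFA prompt.
accounts = app.get_accounts()
result = app.acquire_token_silent(["Files.Read.All"], account=accounts[0]) if accounts else None

if not result:
    # Only needed on the first run, or once the cached refresh token has expired.
    flow = app.initiate_device_flow(scopes=["Files.Read.All"])
    print(flow["message"])
    result = app.acquire_token_by_device_flow(flow)

# Persist the cache so the next scheduled run can stay silent.
if cache.has_state_changed:
    with open(CACHE_PATH, "w") as f:
        f.write(cache.serialize())
```

Note that a serialized cache contains refresh tokens, so it needs to live somewhere appropriately protected.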
Hi @tayloramy
Glad to hear your issue has been resolved. If any of the responses here helped address the problem, we'd appreciate it if you could mark that reply as a solution. This will also help other community members who might face a similar issue.
Thanks for being part of the Fabric Community Forum.
Hi @tayloramy
Thanks for confirming. Since the issue is not yet resolved, could you please share the latest error message?
Hi @tayloramy
Just following up to check if you were able to capture the latest error message. Sharing that will help us look into the issue further.
These tokens expire. Be prepared to reauthenticate periodically.
Hi @tayloramy
Thanks for sharing the error details. From the logs, the refresh is failing because the dataflow is trying to replace the table contents, but a temporary file path (_mashup_temporary) already exists in the Lakehouse. This results in a 409 – Conflict error.
Here are a few things you can check - for example, whether a stale _mashup_temporary file from the failed run is still present under the table's _delta_log folder. Clearing it before the next refresh may unblock the dataflow, as sketched below.
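A hedged cleanup sketch from a Fabric notebook - the workspace and lakehouse names are placeholders, matching the XXX segments in the error URL, and mssparkutils is preloaded in Fabric notebooks:

```python
# Hedged sketch: remove the leftover _mashup_temporary folder from a failed refresh
# so the next run can recreate it. <workspace> and <lakehouse> are placeholders.
from notebookutils import mssparkutils

temp_path = (
    "abfss://<workspace>@onelake.dfs.fabric.microsoft.com/"
    "<lakehouse>.Lakehouse/Tables/dbo/ContractedOccupancy/_delta_log/_mashup_temporary"
)

# Delete the folder (and its contents) only if it is actually there.
if mssparkutils.fs.exists(temp_path):
    mssparkutils.fs.rm(temp_path, recurse=True)
```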
Hope this helps!
Hi @v-aatheeque