tayloramy
Resident Rockstar

Error in Dataflow Gen 2 loading from Excel file on SharePoint Online

Hi, 

 

I am working on a process to take Microsoft Forms data and put it into a database. To do this, I am doing the following steps: 

 

1: Power Automate flow that runs when a response is submitted to the form, and saves the response in an excel file on SPO. 

2: Dataflow gen 2 to load the data from the Excel file into a lakehouse. 

 

For step 2, when I first created the dataflow and ran it, it worked perfectly. However, on every subsequent run, it has failed. 

 

Here is the error message I am getting: 

 

Query (2)_WriteToDataDestination: There was a problem refreshing the dataflow: 'Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: Error in replacing table's content with new data in a version: #{0}., InnerException: #{0} failed to get contents from '#{1}'. Status code: #{2}, description: '#{3}'., Underlying error: AzureDataLakeStorage failed to get contents from 'https://onelake.dfs.fabric.microsoft.com/XXX/XXX/Tables/dbo/ContractedOccupancy/_delta_log/_mashup_temporary/_mashup_temporary_6044a1d6-e8e3-40bc-8cc1-a8fc45b434a7.version'. Status code: 409, description: 'The specified path already exists.'. Details: Reason = DataSource.Error;ErrorCode = Lakehouse036;Message = AzureDataLakeStorage failed to get contents from 'https://onelake.dfs.fabric.microsoft.com/XXX/XXX/Tables/dbo/ContractedOccupancy/_delta_log/_mashup_temporary/_mashup_temporary_6044a1d6-e8e3-40bc-8cc1-a8fc45b434a7.version'. Status code: 409, description: 'The specified path already exists.'.;Detail = [DataSourceKind = "Lakehouse", DataSourcePath = "Lakehouse", DataSourceKind.2 = "AzureDataLakeStorage", DataSourcePath.2 = "https://onelake.dfs.fabric.microsoft.com/XXX/XXX/Tables/dbo/ContractedOccupancy/_delta_log/_mashup_temporary/_mashup_temporary_6044a1d6-e8e3-40bc-8cc1-a8fc45b434a7.version", Url = "https://onelake.dfs.fabric.microsoft.com/XXX/4XXX/Tables/dbo/ContractedOccupancy/_delta_log/_mashup_temporary/_mashup_temporary_6044a1d6-e8e3-40bc-8cc1-a8fc45b434a7.version"];Message.Format = #{0} failed to get contents from '#{1}'. Status code: #{2}, description: '#{3}'.;Message.Parameters = {"AzureDataLakeStorage", "https://onelake.dfs.fabric.microsoft.com/XXX/XXX/Tables/dbo/ContractedOccupancy/_delta_log/_mashup_temporary/_mashup_temporary_6044a1d6-e8e3-40bc-8cc1-a8fc45b434a7.version", 409, "The specified path already exists."};ErrorCode = 10266;Microsoft.Data.Mashup.Error.Context = User GatewayObjectId: XXX'. Error code: 104100. (Request ID: XXX).

 

This is complaining that lakehouse files already exist, but I'm not sure how to fix this. The dataflow is configured to replace data in the table, so it should be overwriting it. Does anyone know what I'm missing here? 

 

Thanks! 

 

BhaveshPatel
Community Champion

Hi @tayloramy 

 

Make sure the data types are correct when you send the Excel file, because they often are not. Also check that the Dataflow Gen 2 query applies the correct data types, and open the data in the lakehouse to verify the schema evolution. (The reason is that Parquet and Excel have different type systems: a Parquet table with a _delta_log stores strictly typed columns such as STRING, while an Excel cell can hold string, number, or text formats interchangeably.)

 

 

Instead, use Python; life becomes much simpler: Python --> Data Lake (Apache Spark) --> Delta Lake (Lakehouse)
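The Python --> Spark --> Delta route above could look something like this in a Fabric notebook. This is a minimal sketch, not a tested implementation: the file path, sheet contents, table name, and type map are all placeholder assumptions, and the explicit casts are there to address the Excel-vs-Parquet typing issue mentioned above.

```python
# Sketch of the suggested route: Python --> Spark --> Delta Lake.
# The path, table name, and type map below are placeholders; explicit
# casts keep Excel's loosely typed cells from leaking into the Delta table.

def normalize_columns(columns):
    """Make Excel headers safe for Delta: trim, lowercase, no spaces."""
    return [c.strip().lower().replace(" ", "_") for c in columns]

def excel_to_delta(xlsx_path, table_name, type_map):
    # Local imports so this sketch can be read/tested without Spark installed.
    import pandas as pd
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    pdf = pd.read_excel(xlsx_path)               # requires openpyxl
    pdf.columns = normalize_columns(pdf.columns)

    df = spark.createDataFrame(pdf)
    # Cast every column explicitly so Excel's guessed types don't win.
    for col, spark_type in type_map.items():
        df = df.withColumn(col, df[col].cast(spark_type))

    df.write.format("delta").mode("overwrite").saveAsTable(table_name)

# Example call (all names hypothetical):
# excel_to_delta("/lakehouse/default/Files/responses.xlsx",
#                "ContractedOccupancy",
#                {"submitted_at": "timestamp", "occupancy": "int"})
```

Since the write uses mode("overwrite") on a named table, each run replaces the previous contents, which matches the replace behavior the dataflow was configured for.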

 

 

Thanks & Regards,
Bhavesh

Love the Self Service BI.
Please use the 'Mark as answer' link to mark a post that answers your question. If you find a reply helpful, please remember to give Kudos.

Hi @BhaveshPatel

 

I would prefer to use a notebook; however, that makes auth with SPO challenging, as I'm not allowed to have a service principal for this site - I need to use OAuth. 

 

Do you know of a nice way to handle accessing SPO from a notebook while using a user's credentials? Note that my org also enforces MFA for access outside of our network. 

 

Thanks,

Taylor

Hi @tayloramy 

I just wanted to check in: have you been able to make any progress with accessing SPO from a notebook using OAuth and MFA?

Hi, 

I have not; my org enforces MFA when signing into SPO, so I can't have something run automatically. This is why I was using Dataflow Gen 2: that connection caches the token and asks for MFA on initial setup, but then works fine automatically from then on. 
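For what it's worth, a notebook can mimic that token-caching behaviour with MSAL's device-code flow: you complete MFA interactively once, persist the token cache, and acquire_token_silent refreshes silently on later runs. A rough sketch, not tested against any particular tenant; the client ID, cache path, and scopes are placeholder assumptions:

```python
# Sketch: one interactive MFA sign-in, then silent refreshes from a saved
# MSAL token cache. CLIENT_ID, CACHE_PATH, and the scopes are placeholders.
import os

CACHE_PATH = "/lakehouse/default/Files/msal_cache.bin"  # hypothetical location

def graph_scopes(*permissions):
    """Build fully qualified Microsoft Graph scope strings."""
    return [f"https://graph.microsoft.com/{p}" for p in permissions]

def get_token(client_id, authority, scopes, cache_path=CACHE_PATH):
    import msal  # local import: the sketch stays importable without msal

    cache = msal.SerializableTokenCache()
    if os.path.exists(cache_path):
        cache.deserialize(open(cache_path).read())

    app = msal.PublicClientApplication(client_id, authority=authority,
                                       token_cache=cache)

    accounts = app.get_accounts()
    result = app.acquire_token_silent(scopes, account=accounts[0]) if accounts else None

    if not result:
        # First run (or expired refresh token): device-code flow triggers MFA once.
        flow = app.initiate_device_flow(scopes=scopes)
        print(flow["message"])   # tells the user where to enter the device code
        result = app.acquire_token_by_device_flow(flow)

    if cache.has_state_changed:
        open(cache_path, "w").write(cache.serialize())
    return result["access_token"]

# Usage (placeholder IDs):
# token = get_token("00000000-0000-0000-0000-000000000000",
#                   "https://login.microsoftonline.com/<tenant>",
#                   graph_scopes("Sites.Read.All"))
```

Note the cached refresh token itself eventually expires under most tenant policies, so periodic re-authentication is still needed.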

Hi @tayloramy 

Glad to hear your issue has been resolved. If any of the responses here helped address the problem, we’d appreciate it if you could mark it as a solution. This will also help other community members who might face a similar issue.

Thanks for being part of the Fabric community forum.





Hi @v-aatheeque, The issue has not been resolved.  

Hi @tayloramy 

Thanks for confirming. Since the issue is not yet resolved, could you please share the latest error message?

Hi @tayloramy 

Just following up to check if you were able to capture the latest error message. Sharing that will help us look into the issue further.

Hi @v-aatheeque

 

It is the same error message I posted earlier.  

These tokens expire. Be prepared to reauthenticate periodically.

v-aatheeque
Community Support

Hi @tayloramy 

Thanks for sharing the error details. From the logs, the refresh is failing because the dataflow is trying to replace the table contents, but a temporary file path (_mashup_temporary) already exists in the Lakehouse. This results in a 409 – Conflict error.

 

Here are a few things you can check:

  • Since your dataflow replaces table data, confirm the overwrite option is correctly configured.
  • Make sure no other refresh/pipeline is writing to the same entity at the same time.
  • Failed or interrupted runs may leave behind _mashup_temporary files. Deleting these can often resolve the issue.
  • If you’re using Power Automate or scheduling frequent refreshes, allow some buffer time before the next attempt.
  • As a quick isolation step, try refreshing with a new Excel file stored in SharePoint to check if the issue is tied to the current path.

Hope this helps!
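If leftover files are the culprit, they can also be removed programmatically through the OneLake ADLS Gen2 endpoint rather than by hand in OneLake Explorer. A hedged sketch; the workspace, lakehouse, and table path below are placeholders, and only the filtering helper is shown end to end:

```python
# Sketch: find and delete leftover _mashup_temporary artifacts under a
# Delta table's _delta_log via the OneLake ADLS Gen2 API. Workspace,
# lakehouse, and table names are placeholders.

def mashup_temp_paths(paths):
    """Filter a list of path strings down to _mashup_temporary leftovers."""
    return [p for p in paths if "_mashup_temporary" in p]

def delete_mashup_leftovers(workspace, table_prefix):
    # Local imports: the sketch stays importable without the Azure SDK.
    from azure.identity import DefaultAzureCredential
    from azure.storage.filedatalake import DataLakeServiceClient

    service = DataLakeServiceClient(
        account_url="https://onelake.dfs.fabric.microsoft.com",
        credential=DefaultAzureCredential(),
    )
    fs = service.get_file_system_client(workspace)

    names = [p.name for p in fs.get_paths(path=table_prefix, recursive=True)]
    for name in mashup_temp_paths(names):
        fs.delete_file(name)
        print("deleted", name)

# Usage (placeholder names):
# delete_mashup_leftovers(
#     "MyWorkspace",
#     "MyLakehouse.Lakehouse/Tables/dbo/ContractedOccupancy/_delta_log")
```

Running something like this just before the scheduled refresh would clear any conflict from a previously interrupted run, though it does not explain why the temporary path keeps reappearing.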

Hi @v-aatheeque

 

  • Since your dataflow replaces table data, confirm the overwrite option is correctly configured.
    • Yes, this is configured properly
  • Make sure no other refresh/pipeline is writing to the same entity at the same time.
    • Nothing else is touching this lakehouse at all
  • Failed or interrupted runs may leave behind _mashup_temporary files. Deleting these can often resolve the issue.
    • I deleted these files from OneLake Explorer, and the issue still remains
  • If you’re using Power Automate or scheduling frequent refreshes, allow some buffer time before the next attempt.
    • I am not, this is a daily refresh and no more frequent than that.
  • As a quick isolation step, try refreshing with a new Excel file stored in SharePoint to check if the issue is tied to the current path.
    • I have tried this and am seeing similar errors. First the file was in a Teams SPO site, and now it's in a native SPO site. Same issue. 
