JasonArmstrong
New Member

Dataflow Gen2 - Excel in SharePoint to Lakehouse - WriteToDataDestination Error

I have created a Dataflow Gen2 which uses my organisation's On-Premises Gateway to connect to a SharePoint site, navigate to the folder and files, and import the Excel workbooks into Power Query, transforming the column headers so they're compliant with a Lakehouse's requirements.
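(For illustration, "compliant" here means Delta-friendly column names - a Lakehouse table rejects names containing spaces or characters such as ,;{}()=. A minimal Python sketch of that kind of header cleanup, with a hypothetical function name:)

```python
import re
import pandas as pd

def lakehouse_safe_columns(df: pd.DataFrame) -> pd.DataFrame:
    """Rename columns to Lakehouse-friendly identifiers,
    e.g. "Output Category" -> "output_category"."""
    def clean(name: str) -> str:
        name = name.strip().lower()
        name = re.sub(r"[\s\-]+", "_", name)    # spaces/hyphens -> underscores
        return re.sub(r"[^0-9a-z_]", "", name)  # drop remaining special characters
    return df.rename(columns={c: clean(c) for c in df.columns})
```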

 

All of the Excel files contain incredibly small sets of data and only provide some currently semi-static, unsystemised information that is required for our reporting.

 

For context:

  • We use a gateway to connect to the source (This is at site level, not at file level. Does that matter?)
  • The default destination is the lakehouse and the update method is set to "Replace" with column mapping set to "Auto mapping and allow for schema change" (though admittedly I do not know how to change this)
  • The Dataflow has 5 separate queries in it to retrieve the Excel data from that SharePoint site. I tested it with a single query and still got the same error. I also tested it with a file which has perfectly named columns already and got the same issue.
  • Staging is turned off for these files

 

When I run this Dataflow for the first time it successfully runs, retrieving the information and creating the tables in the Lakehouse.

[Screenshot: tables created successfully in the Lakehouse]

 

When I refresh or republish this Dataflow I experience these errors:

  1. There was a problem refreshing the dataflow: 'Something went wrong, please try again later. If the error persists, please contact support.'. Error code: ActionUserFailure. (Request ID: 77ff67ee-76c1-4dbc-9e22-29758e763e15).
  2. SP_Excel_ad_narrative_WriteToDataDestination: There was a problem refreshing the dataflow: 'Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: Error in comitting version., InnerException: We can't update the value. The operation or its arguments aren't supported by the value., Underlying error: We can't update the value. The operation or its arguments aren't supported by the value. Details: Reason = DataSource.Error;Message = We can't update the value. The operation or its arguments aren't supported by the value.;Detail = #table({"Version", "Published", "Data", "Modified"}, {});Message.Format = We can't update the value. The operation or its arguments aren't supported by the value.;ErrorCode = 10555;Microsoft.Data.Mashup.Error.Context = User GatewayObjectId: 'gatewayobjectidremovedforthispost'. Error code: Mashup Exception Data Source Error. (Request ID: 77ff67ee-76c1-4dbc-9e22-29758e763e15).

I for the life of me cannot understand what the issue is and it's driving me mad.

 

Is there any way to work around this, such as ingesting the files from SharePoint via a data pipeline, or uploading the files on a schedule and then converting them to tables using a notebook? Some way to avoid this nonsense.
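(A minimal sketch of that notebook workaround, assuming a pipeline has already copied the workbook into the default Lakehouse's Files area - the landing path is a hypothetical placeholder; the table name is taken from the error above:)

```python
import pandas as pd

# Hypothetical landing path: assumes a pipeline copy activity has already
# dropped the workbook into the default Lakehouse's Files area.
src = "/lakehouse/default/Files/landing/ad_narrative.xlsx"

pdf = pd.read_excel(src)  # requires openpyxl in the notebook environment
# Lakehouse-friendly headers, e.g. "Output Category" -> "output_category"
pdf.columns = [c.strip().lower().replace(" ", "_") for c in pdf.columns]

# `spark` is predefined in Fabric notebooks; "overwrite" replaces the table
# contents on each run, mirroring the dataflow's "Replace" update method.
spark.createDataFrame(pdf).write.mode("overwrite").format("delta").saveAsTable("ad_narrative")
```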

4 REPLIES
GilbertQ
Super User

Hi @JasonArmstrong 

 

What happens if you change the data destination to simply overwrite, and don't set the column mapping to auto but leave it as fixed, to see if that first solves the issue? Then we'd know whether it's the auto mapping that is causing the issue.

 






JasonArmstrong

Firstly, thanks for replying and offering your help.

 

To test what you have suggested I created a new flow to ingest one of the Excel files into the lakehouse.

Instead of setting the default data destination I manually set it to the lakehouse so I could use the UI to tell the flow to replace the data instead of appending (it was already replacing in the default).

I was unsure which you would suggest when choosing dynamic vs fixed - I chose fixed, as the tooltip for dynamic sounded like it could be an issue downstream if changes were ever made to the Excel. (I would appreciate some context on this decision too - does dynamic mean that if someone changes a column name in an Excel file, measures in my custom semantic model will be deleted?)

 

I left the column names as they were, e.g. I kept "Output Category" instead of changing it to "output_category". I was forced to map the datatypes, as the default "Any" was not permitted if using the auto option.

 

The dataflow published successfully the first time, as expected, and then proceeded to refresh successfully. I added a new row to the dataset and it loaded into the lakehouse table successfully.

 

I created another flow to test the same process with "Dynamic" selected; it published successfully, creating the tables in the Lakehouse, but failed the refresh. So it seems that is the issue, though I don't understand why.

 

Hi @JasonArmstrong, thank you for reaching out to the Microsoft Community Forum.

 

This is caused by using Dynamic schema with Auto mapping and allow for schema change, which causes the Dataflow to try and rebuild the Lakehouse table during refresh. This often fails when the Excel structure changes even slightly, especially through an On-Premises Gateway.

 

Configure the Dataflow to use a Fixed schema and manually map each column. Set explicit data types and avoid leaving any column as "Any". Also, make sure the update method is set to Replace, not Append. This prevents table recreation and ensures stable refreshes, as your testing already confirmed.

 

If your Excel files are likely to change structure in the future, consider adding a SharePoint landing folder and using a notebook to clean and standardize files before ingestion into the Lakehouse, as sketched below. This gives you control over schema changes without relying on fragile Dynamic settings.
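(A minimal sketch of that landing-folder pattern, extending the notebook idea above; every path and naming rule here is an assumption, not a confirmed design:)

```python
import os
import pandas as pd

landing = "/lakehouse/default/Files/landing"  # hypothetical landing folder

for fname in os.listdir(landing):
    if not fname.lower().endswith(".xlsx"):
        continue
    pdf = pd.read_excel(os.path.join(landing, fname))  # requires openpyxl
    # Standardise headers before ingestion so the table schema stays stable
    pdf.columns = [c.strip().lower().replace(" ", "_") for c in pdf.columns]
    table = os.path.splitext(fname)[0].lower().replace(" ", "_")
    spark.createDataFrame(pdf).write.mode("overwrite").format("delta").saveAsTable(table)
```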

 

If this helped solve the issue, please consider marking it "Accept as Solution" so others with similar queries may find it more easily. If not, please share the details - always happy to help.
Thank you.

JasonArmstrong

Sorry to bring this up again now -

 

The above, previously working solution is now failing with a new error. I made a post earlier basically confirming it was working, only to be met with new errors that started occurring.

 

Steps taken:

  1. Select Dataflow Gen2 (deselect the custom option for CI/CD)
  2. Select the SharePoint site data source
  3. Navigate to the Excel file (select "Binary") and import the Excel workbook
  4. Navigate to the Excel table (select "Table") and load the Excel table
  5. Add a data destination
  6. Select my lakehouse
  7. Select new table
  8. Untick "Use automatic settings"
  9. Ensure Update Method is "Replace" and Schema options on publish is "Fixed schema"
  10. Map source types in column mapping (only using "Text" and "Decimal")
  11. Save settings
  12. Publish (succeeds and creates the table)
  13. Test refresh (fails and produces the following error)

Error: 56107220-e43d-4cd1-bd8d-86a90f3883c3_WriteToDataDestination: There was a problem refreshing the dataflow: 'Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: Error in replacing table's content with new data in a version: #{0}., InnerException: #{0} failed to get contents from '#{1}'. Status code: #{2}, description: '#{3}'., Underlying error: AzureDataLakeStorage failed to get contents from 'https://onelake.dfs.fabric.microsoft.com/260e1283-38ee-4a8a-8a97-31969b5893f8/28158a71-a986-4c84-899...'. Status code: 409, description: 'The specified path already exists.'. Details: Reason = DataSource.Error;Message = AzureDataLakeStorage failed to get contents from 'https://onelake.dfs.fabric.microsoft.com/260e1283-38ee-4a8a-8a97-31969b5893f8/28158a71-a986-4c84-899...'. Status code: 409, description: 'The specified path already exists.'.;Detail = [DataSourceKind = "Lakehouse", DataSourcePath = "Lakehouse", DataSourceKind.2 = "AzureDataLakeStorage", DataSourcePath.2 = "https://onelake.dfs.fabric.microsoft.com/260e1283-38ee-4a8a-8a97-31969b5893f8/28158a71-a986-4c84-899...", Url = "https://onelake.dfs.fabric.microsoft.com/260e1283-38ee-4a8a-8a97-31969b5893f8/28158a71-a986-4c84-899..."];Message.Format = #{0} failed to get contents from '#{1}'. Status code: #{2}, description: '#{3}'.;Message.Parameters = {"AzureDataLakeStorage", "https://onelake.dfs.fabric.microsoft.com/260e1283-38ee-4a8a-8a97-31969b5893f8/28158a71-a986-4c84-899...", 409, "The specified path already exists."};ErrorCode = 10266;Microsoft.Data.Mashup.Error.Context = User GatewayObjectId: '---'. Error code: Mashup Exception Data Source Error. (Request ID: da9bb4e4-4b37-4e9a-962e-dc421b725432).
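(For what it's worth, the 409 "The specified path already exists" suggests the dataflow's replace is recreating the table's folder in OneLake and colliding with the existing path. A notebook write with Spark's overwrite mode replaces a Delta table's contents transactionally rather than recreating the path, so it may sidestep this; if the workbook's columns can change, Delta's overwriteSchema option lets the schema be replaced too. A sketch, table name hypothetical as before:)

```python
import pandas as pd

pdf = pd.read_excel("/lakehouse/default/Files/landing/ad_narrative.xlsx")  # as above
# Transactional replace of an existing Delta table; the underlying OneLake
# folder is not deleted and recreated on each run.
(spark.createDataFrame(pdf).write
    .mode("overwrite")
    .option("overwriteSchema", "true")  # Delta option: allow schema replacement
    .format("delta")
    .saveAsTable("ad_narrative"))
```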
