009co
Helper IV

Dataflow Gen 2 fail error message: WriteToDatabaseTableFrom_TransformForOutputToDatabaseTableFrom_20

The Activities section of my Dataflow Gen 2 failure error detail includes the message:

 

WriteToDatabaseTableFrom_TransformForOutputToDatabaseTableFrom_20

 

With some additional error details:

 

null Error: Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: PowerBIEntityNotFound: PowerBIEntityNotFound Details: Error = PowerBIEntityNotFound;RequestId

 

The query itself returns data. The staging tables contain the query data. 

 

It appears that the dataflow is not able to write the data to the Lakehouse tables and that is where it fails.

 

The data source is a OneDrive CSV file retrieved using the web connector, but I get the same results when the data source is the same CSV file saved under Lakehouse Files.

 

The column names contain no spaces or special characters, and the query columns all have data types assigned correctly. The data itself is bog-standard datetime, string, and decimal values. The files have about 50k records and roughly 10 columns. All of the connection authentications work fine using my Office 365 organizational account.
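
For reference, the query follows roughly this pattern; the URL and column names below are placeholders rather than the actual ones:

let
    // Placeholder direct-download URL for the OneDrive csv
    Source = Csv.Document(
        Web.Contents("https://contoso-my.sharepoint.com/personal/user/Documents/sample.csv"),
        [Delimiter = ",", Encoding = 65001, QuoteStyle = QuoteStyle.None]
    ),
    PromotedHeaders = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    // Every column is given an explicit type
    Typed = Table.TransformColumnTypes(
        PromotedHeaders,
        {{"EventDate", type datetime}, {"Category", type text}, {"Amount", type number}}
    )
in
    Typed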

 

What else can I do or look at to get more detailed diagnostic information to work through these issues? 

4 REPLIES
009co
Helper IV

Here are some troubleshooting observations that seem to have helped with this "WriteToDatabaseTableFrom_TransformForOutputToDatabaseTableFrom" type of error:

 

1. Every time you open the Dataflow Gen2 editor to change the dataflow query, data destination, etc., redo the connection details, e.g. click Edit connection and go through the steps to choose your connection again. Maybe this isn't necessary, but it seems to me the dataflow can forget the connection.

 

2. Ensure your dataflow query results are all assigned data types. 

* Always use datetime, not date, even if your date is just a date (a Delta table feature!)

* Always assign a data type to each column. If you forget to assign one, the dataflow gives the column the type "Any", which cannot be written to a table. The data destination schema page will give you a message to this effect.

* Some steps in the query may make a column "forget" the data type it was assigned in a previous step, so make sure the final step assigns every column its data type (see the sketch after this list). This is the same in Dataflow "gen 1" too. Power BI and Excel queries are a lot more forgiving about this, e.g. they will produce results you can use without any errors (although the data types might not work in relationships or behave as expected in results and visuals).

 

3. If you change columns or their data types in your dataflow query, delete the data destination table from the Lakehouse (and presumably the DWH, though I haven't worked with DWH tables yet) and then run the updated dataflow again to recreate the table. The dataflow doesn't appear to update an existing table's schema, e.g. add new columns or remove or modify existing ones.

 

4. Find out why the dataflow failed. In the Workspace list of objects (dataflows, lakehouses, reports, datasets, etc.), a failed dataflow shows a red triangle. Click on it, or click the 3 dots and open Refresh history, then either 1) click the download symbol to get a CSV export with some information, or better yet, 2) click the history date-time link to see more details. This is where I see the "WriteToDatabaseTableFrom_TransformForOutputToDatabaseTableFrom" message.

 

5. Refreshing the lists of items, or refreshing the entire browser page, really helps to show the most recent status and objects. For example, sometimes a newly added table doesn't show up until the page is refreshed.

 

6. Make sure the data destination table selection (new vs. existing) is set correctly. If you create a new dataflow and set it to new, subsequent runs seem to change it to existing. But when you delete the table so you can create a new one with new or modified columns, you need to switch it back to new.

 

7. Never delete the Lakehouse or DWH staging artifacts. These should probably be hidden or hideable. They seem to accumulate historical content, e.g. retaining previously deleted tables, each as a unique item. Not sure what happens if they collect too much historical data or whether that will ever be a problem.
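
To illustrate point 2, here is a minimal M sketch (placeholder table and column names) showing how a step such as Table.AddColumn without a type argument leaves a column typed as "Any", and how a final step can explicitly re-assign every column's type, using datetime rather than date:

let
    // Placeholder sample data standing in for the real query output
    SampleRows = #table(
        type table [EventDate = datetime, Category = text, Amount = number],
        {{#datetime(2024, 1, 15, 0, 0, 0), "A", 12.5}}
    ),
    // Adding a column without the optional type argument leaves it typed as "Any"
    WithYear = Table.AddColumn(SampleRows, "EventYear", each Date.Year([EventDate])),
    // Final step: explicitly re-assign every column's type so nothing is written as "Any"
    Typed = Table.TransformColumnTypes(
        WithYear,
        {
            {"EventDate", type datetime},
            {"Category", type text},
            {"Amount", type number},
            {"EventYear", Int64.Type}
        }
    )
in
    Typed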

 

This "WriteToDatabaseTableFrom_TransformForOutputToDatabaseTableFrom" type of message error message actually seems quite informative basically that something is preventing dataflow results from being written to whereever you are putting it eg in Lakehouse or DWH.

miguel
Community Admin

Hi!

When it comes to your workspace, do you see a Lakehouse with the word "Staging" in its name?

 

You can read more about a similar situation from the thread below:

Solved: Re: Dataflow Gen2 Web.Contents : Error : PowerBIEn... - Microsoft Fabric Community

009co
Helper IV

Hey Miguel,

 

Yeah it definitely had something to do with the staging artifacts.

 

While troubleshooting / learning how it all works, I was deleting the staging lakehouse and any dataflows (Gen 2) and then recreating the dataflows, which recreated the staging lakehouse.

 

However, now I am pretty sure that under no circumstances should the staging lakehouse (or other staging artifacts) be deleted. I get the idea that they really should be hidden or otherwise treated as "system" artifacts, and you should just let them do their thing.

 

I had to delete my Workspace and create a new one and start all over again to make it work.

 

Can you confirm the above is true?

 

Thanks!

jwelch
Microsoft Employee

You are correct, the automatically created LH / DW artifacts should not be deleted. Even if you create a new dataflow (which will provision new artifacts), existing dataflows will currently be left in a broken state.

 

We have several improvements planned for this experience.
