Anonymous
Not applicable

Dataflow Gen2 to Lakehouse fails with data destination

Hi everyone,

 

I have created a Dataflow Gen2 building on a Dataflow.

When I do not add a data destination, the loading process works, and the tables are loaded into the automatically created Lakehouse (DataflowsStagingLakehouse).

However, if I add a destination, the load fails.


 

Error message:

null Error: Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: PowerBIEntityNotFound: PowerBIEntityNotFound Details: Error = PowerBIEntityNotFound


 

Unfortunately, I do not really understand the issue.

Can anyone help?

 

My actual goal was to load data from an on-premises SQL Server into the Lakehouse.

I have faced a similar issue to the one in this post:

https://community.fabric.microsoft.com/t5/Issues/Unable-to-feed-on-premises-SQL-Server-data-into-Lak...


 

Error Details: Error: Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: We don't support creating directories in Azure Storage unless they are empty. Details: #table({"Content", "Name", "Extension", "Date accessed", "Date modified", "Date created", "Attributes", "Folder Path"},

 

Any help is appreciated.

 

Kind regards

 

1 ACCEPTED SOLUTION
jinbeman
Regular Visitor

Hello,

I was experiencing the same issue, this post helped me to solve the problem.


11 REPLIES

Anonymous
Not applicable

Changing the datatype of the date columns to datetime solved my problem (for the first scenario: Dataflow Gen2 building on a Dataflow). Thanks for the link!
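For anyone looking for the concrete step, the datatype fix above can be sketched in Power Query M. The table and the "OrderDate" column are placeholders standing in for the query imported from the Dataflow; adapt the list to cover every date column in your own query:

```m
let
    // Sample table standing in for the query imported from the "old" Dataflow;
    // "OrderDate" is a hypothetical column name
    Source = #table({"Id", "OrderDate"}, {{1, #date(2023, 5, 31)}}),
    // Convert date columns to datetime before writing to the Lakehouse destination
    Typed = Table.TransformColumnTypes(Source, {{"OrderDate", type datetime}})
in
    Typed
```

You can also change the type through the column header menu in the Power Query editor, which generates the same Table.TransformColumnTypes step.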

Bazi,

I just tried changing the column to datetime, but no luck here. 

 

Received the usual fail message.

 


 

Anonymous
Not applicable

Hi SFDucati,

This is the error message you receive when you try to connect the Dataflow Gen2 to your on-premises data source, right?

Try creating an "old" Dataflow and connecting it to your on-premises table.



After that, create a Dataflow Gen2 and connect it to your "old" Dataflow.
There you should change the column type of the date columns to datetime, with the destination set to your Lakehouse.

So the path of the data is:

Dataflow - Dataflow Gen2 - Lakehouse

This worked for me.
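A minimal sketch of that chained setup in Power Query M, assuming the standard PowerPlatform.Dataflows connector; the GUIDs and the "Orders"/"OrderDate" names are all placeholders for your own workspace, dataflow, entity, and date column:

```m
let
    // Connect to existing dataflows (the "old" Dataflow created in the first step)
    Source = PowerPlatform.Dataflows(null),
    Workspaces = Source{[Id = "Workspaces"]}[Data],
    // Placeholder GUIDs: replace with your workspace and dataflow IDs
    Workspace = Workspaces{[workspaceId = "00000000-0000-0000-0000-000000000000"]}[Data],
    Dataflow = Workspace{[dataflowId = "11111111-1111-1111-1111-111111111111"]}[Data],
    Orders = Dataflow{[entity = "Orders", version = ""]}[Data],
    // Convert the date columns to datetime before the Lakehouse destination step
    Typed = Table.TransformColumnTypes(Orders, {{"OrderDate", type datetime}})
in
    Typed
```

The Lakehouse destination itself is then configured in the Dataflow Gen2 editor on this final query, not in the M code.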

Ok,

 

I tried your suggestion. I created a new Dataflow, not a Dataflow Gen2. Using the same query, it created the table. I saved and published. Then I refreshed it, and it worked. Then I created a new Dataflow Gen2 that accessed the new Dataflow. The same table content was created in the Dataflow Gen2. I then made the data destination my Lakehouse. Then published.

 

Then the same error. 

 


 

I am wondering if I have to delete everything and start over. The only thing that differs from Microsoft's tutorial page is that I am using web sources, while the tutorial uses their sample data on GitHub.

Bazi,

 

I can try that workaround and see if it works. I am actually connected to web-based sources, not on-premises data. Interestingly, the Dataflow Gen2 Power Query queries are the exact same queries I have been using in the Power Apps Dataflows. Everything was fine there. So when I started the Fabric Trial, I just cut and pasted them into the Dataflow Gen2s. The data appears fine during the online Power Query experience; it's just when I publish that it fails. I worked with support for about 4 days, and they had no answers. They just said it's a known bug and they are working on a solution. My issue is, why did they not test this before the rollout? And why are they continuing to count down our Trial Days when we can't even get it to work?

Anonymous
Not applicable

Unfortunately, I cannot help you in this case.
I guess if Microsoft says it is a known bug, we probably have to wait until they fix it.

Aghh... This is so confusing. I don't even see an option in Fabric to create something like an "old" Dataflow. Is this a Mapping Dataflow as in Data Factory, or a Power BI Dataflow?

I see these as the options to create new items in either the Data Factory area... or whatever it is called... and the Power BI area. Does anyone else find this layout very confusing, or am I just a curmudgeon? Don't answer that. 🙂

 


 

Anonymous
Not applicable

Just enter your workspace and click on the plus; there you can see Dataflow and Dataflow Gen2 (Preview).
Dataflow is what I was referring to as the "old" Dataflow.

I agree, it is kind of confusing at the beginning.

SFDucati
Frequent Visitor

I am having the same issue. I opened a ticket with Support, and they indicated there is a known bug in Dataflows Gen2 causing it not to load to the Lakehouse. Has anyone solved this issue? I have been waiting since Monday for a solution.

 

Hi, did you ever solve this? It still appears to be an issue. I have a query that took less than a minute to run, but the WriteToDataDestination activity ran for 8 hours before failing.

 
