Hi everyone,
I have created a Dataflow Gen2 that builds on a Dataflow.
When I do not add a data destination, the loading process works and the tables are loaded into the automatically created Lakehouse (DataflowsStagingLakehouse).
However, if I add a destination, the load fails.
Error message:
null Error: Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: PowerBIEntityNotFound: PowerBIEntityNotFound Details: Error = PowerBIEntityNotFound
Unfortunately, I do not really understand the issue.
Can anyone help?
My actual goal is to load data from an on-premises SQL Server into the Lakehouse.
I have also faced a similar issue to the one described in this post:
Error Details: Error: Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: We don't support creating directories in Azure Storage unless they are empty. Details: #table({"Content", "Name", "Extension", "Date accessed", "Date modified", "Date created", "Attributes", "Folder Path"},
Any help is appreciated.
Kind regards
Hello,
I was experiencing the same issue; this post helped me to solve the problem.
Changing the datatype of the date columns to datetime solved my problem (the first problem: Dataflow Gen2 building on a Dataflow). Thanks for the link.
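For anyone wanting to reproduce that fix in the query editor, here is a minimal Power Query M sketch of the type change. The table and column names are only placeholder examples, not taken from the original posts:

let
    // Stand-in source table; in a real dataflow this would be the query's existing output
    Source = #table(
        type table [OrderID = Int64.Type, OrderDate = date],
        {{1, #date(2023, 6, 1)}, {2, #date(2023, 6, 2)}}
    ),
    // Cast the date column to datetime so the Lakehouse destination accepts it
    ChangedTypes = Table.TransformColumnTypes(Source, {{"OrderDate", type datetime}})
in
    ChangedTypes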
Bazi,
I just tried changing the column to datetime, but no luck here.
I received the usual failure message.
Hi SFDucati,
This is the error message you receive when you try to connect the Dataflow Gen2 to your on-premises data source, right?
Try to create an "old" Dataflow and connect it to your on-premises table.
After that, create a Dataflow Gen2 and connect it to your "old" Dataflow.
There you should change the column type of the date columns to datetime and set the data destination to your Lakehouse.
So the path of the data is:
Dataflow - Dataflow Gen2 - Lakehouse
This worked for me.
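For illustration only, here is a rough Power Query M sketch of what the Gen2 query in that chain might look like. It assumes the Power Platform Dataflows connector (PowerPlatform.Dataflows) and its usual navigation keys (workspaceId, dataflowId, entity); the GUIDs, the entity name "Orders", and the column name "OrderDate" are hypothetical placeholders that the real navigator would fill in:

let
    // Connect to existing ("old") dataflows through the Power Platform Dataflows connector
    Source = PowerPlatform.Dataflows(null),
    Workspaces = Source{[Id = "Workspaces"]}[Data],
    // Hypothetical workspace and dataflow IDs; the real ones are chosen in the navigator
    Workspace = Workspaces{[workspaceId = "00000000-0000-0000-0000-000000000000"]}[Data],
    Dataflow = Workspace{[dataflowId = "11111111-1111-1111-1111-111111111111"]}[Data],
    Orders = Dataflow{[entity = "Orders", version = ""]}[Data],
    // Cast the date columns to datetime before the Lakehouse destination is applied
    ChangedTypes = Table.TransformColumnTypes(Orders, {{"OrderDate", type datetime}})
in
    ChangedTypes

The step that matters for this thread is the last one: the Lakehouse destination is set on a query whose date columns are already typed as datetime.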
Ok,
I tried your suggestion. I created a new Dataflow, not a Dataflow Gen2. Using the same query, it created the table. I saved and published. Then I refreshed it, and it worked. Then I created a new Dataflow Gen2 that accessed the new Dataflow. The same table content was created in the Dataflow Gen2. I then set the data destination to my Lakehouse and published.
Then the same error.
I am wondering if I have to delete everything and start over. The only thing that differs from Microsoft's tutorial page is that I am using web sources, while the tutorial uses their sample data on GitHub.
Bazi,
I can try that workaround and see if it works. I am actually connected to web-based sources, not on-premises data. Interestingly, the Dataflow Gen2 Power Query queries are exactly the same queries I have been using in the Power Apps Dataflows, and everything was fine there. So when I started the Fabric trial, I just cut and pasted them into the Dataflow Gen2s. The data appears fine during the online Power Query experience; it's just when I publish that it fails. I worked with support for about 4 days, and they had no answers. They just said it's a known bug and they are working on a solution. My issue is: why did they not test this before the rollout? And why do they keep counting down our trial days when we can't even get it to work?
Unfortunately, I cannot help you in this case.
I guess if Microsoft says it is a known bug, we probably have to wait until they fix it.
Aghh... this is so confusing. I don't even see an option in Fabric to create something like an "old" Dataflow. Is this a mapping dataflow as in Data Factory, or a Power BI dataflow?
I see these as the options to create "new" stuff in either the Data Factory area (or whatever it is called) and the Power BI area. Does anyone else find this layout VERY confusing, or am I just a curmudgeon? Don't answer that. 🙂
Just enter your workspace and click on the plus sign; there you can see Dataflow and Dataflow Gen2 (Preview).
Dataflow is what I was referring to as the "old" Dataflow.
I agree, it is kind of confusing at the beginning.
I am having the same issue. I opened a ticket with Support and they indicated there is a known bug on Dataflows Gen2 causing it not to load to the Lakehouse. Has anyone solved this issue? I have been waiting since Monday to find a solution.
Hi, did you ever solve this? It still appears to be an issue. I have a query that took less than a minute to run, but the WriteToDataDestination step ran for 8 hours before failing.