Hi all,
I have a Dataflow which stages the original tables by grabbing CSV files from the Lakehouse Files area, transforming them, and then creating new tables in the Lakehouse. It was designed as a one-off creation of the tables and works perfectly.
The issue is with the monthly update Dataflow, which successfully grabs OData and transforms it (I can see the data correctly in the table previews).
There are 8 queries, each correctly displaying last month's data ready to be appended to the Lakehouse tables.
Each query has its destination mapped correctly to the corresponding Lakehouse table, and the destination configuration looks like the image below.
However, when I publish and refresh the Dataflow, the data isn't appended and no error is thrown. I have also dropped this Dataflow into a Pipeline and run it from there, to no avail: no error is displayed, but querying the Lakehouse tables shows no appended data.
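For reference, this is roughly how I'm checking the tables; a minimal sketch from a Fabric notebook attached to the Lakehouse, where sales_monthly and period are placeholder table/column names:

```python
# Check whether the refresh actually appended rows, grouped by load month.
# sales_monthly and period are placeholders -- substitute your own
# destination table and date/month column.
df = spark.sql("""
    SELECT period, COUNT(*) AS row_count
    FROM sales_monthly
    GROUP BY period
    ORDER BY period DESC
""")
df.show()  # the newest period should appear here if the append succeeded
```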
Hi,
I raised a ticket for another issue I was having, where a Pipeline was failing to start a notebook step that works against the same Lakehouse instance.
The support engineers pointed me to a Microsoft Fabric Community article about an error with a "Livy session".
I ended up rebuilding the Lakehouse from scratch. As various notebooks and dataflows were working individually in the old Lakehouse environment, I couldn't really fault the Lakehouse instance itself, but rebuilding it from scratch has everything now behaving as expected.
So the multi-query Dataflow appending to multiple Lakehouse tables is now working.
(Screenshot: successful pipeline run)
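As a rough sanity check after the rebuild, something like this from a Fabric notebook confirms each destination table grew; the table names below are placeholders for your own tables:

```python
# Confirm each Lakehouse destination table received the monthly append.
# Table names are placeholders -- substitute your own destination tables.
tables = ["table_a", "table_b", "table_c"]  # ...extend to all 8
for t in tables:
    n = spark.sql(f"SELECT COUNT(*) AS n FROM {t}").first()["n"]
    print(f"{t}: {n} rows")
```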
Hi @JimAu ,
Glad to know your issue got resolved.
Please continue using Fabric Community for help regarding your issues.
So, it occurred to me to try the Append step one query at a time in an individual Dataflow. It worked.
This is despite the original Dataflow, which ETL'd CSVs into Lakehouse tables and contained about 10 queries, working just fine.
So my immediate workaround is rebuilding 7 more Dataflows, each containing a single query with a mapped Lakehouse Append destination, and hoping that solves the immediate issue.
Not sure this is by design though.
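One caveat if you re-run the split-out Dataflows: an Append destination can double-load a month on a retry. A quick duplicate check from a notebook, where record_id, period, and table_a are placeholder names:

```python
# Look for rows appended more than once after a re-run.
# record_id, period, and table_a are placeholders for your own
# key column, month column, and destination table.
dupes = spark.sql("""
    SELECT record_id, period, COUNT(*) AS copies
    FROM table_a
    GROUP BY record_id, period
    HAVING COUNT(*) > 1
""")
dupes.show()  # an empty result means no double-appends
```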
Hi!
This might require a deeper investigation from our engineering team into how your Dataflow has been created and the logic behind it, to properly understand what might be happening. If it's a bug, we would definitely like to know and properly address it.
Please go ahead and raise a support ticket to reach our support team.
Hi @JimAu ,
I would like to check whether you got a chance to create a support ticket.
If yes, it would be a great help if you could provide the details of the support ticket so we can track it for further information.
Thanks