michaeltrilling
New Member

Blank Lakehouse Table after Dataflow Gen2 executes from Pipeline

We have an intermittent issue whereby the Dataflow Gen2 sometimes fails to save any records to the Lakehouse when run within a Pipeline.  The Lakehouse table is a complete replacement of data on each run.  When we run the Dataflow manually, the Lakehouse table is usually refreshed correctly.
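Since the table is fully replaced on each run, a blank result can slip through silently when every activity reports success. A minimal sketch of a post-refresh sanity check one could bolt onto the Pipeline (for example via a notebook activity) is below; `read_table_rows` is a hypothetical stand-in for whatever query mechanism you use against the Lakehouse, not a Fabric API:

```python
# Hypothetical post-refresh sanity check: fail fast when the Dataflow's
# destination table comes back empty after a full-replace refresh.
# `read_table_rows` is a stub standing in for a real Lakehouse query
# (e.g. Spark SQL in a notebook activity).

def read_table_rows(table_name, _fake_storage={"output_table": []}):
    """Stub for a Lakehouse query; returns the rows of `table_name`."""
    return _fake_storage.get(table_name, [])

def validate_refresh(table_name, min_rows=1):
    """Raise if the destination table has fewer than `min_rows` rows,
    so the Pipeline surfaces a blank-table refresh as a failure."""
    count = len(read_table_rows(table_name))
    if count < min_rows:
        raise RuntimeError(
            f"Table '{table_name}' has {count} rows after refresh; "
            f"expected at least {min_rows}. Blank write suspected."
        )
    return count
```

Wiring a check like this in after the Dataflow activity turns the silent blank-table case into an explicit Pipeline failure, which at least makes the intermittent runs easy to spot.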

14 Replies
michaeltrilling
New Member

Hi Vivien,
Yes, the entire Pipeline finishes successfully including the impacted DataFlow.  Very strange.

Thanks,

Michael

Hello,

Can you describe your workflow in detail, with screenshots of each step, etc., so that we can see if we can identify anything that might be causing the issue?

Have a nice day,

Vivien

Hi Vivien,

Absolutely.  The entire Pipeline is an aggregation of data from Excel files stored in SharePoint.  The Dataflow Gen2 having the issue pulls preprocessed data from the Lakehouse and combines it with attribute data from some Master Data Excel files.

Here is a screenshot of the Dataflow Gen2.  The current query is the one pulling preprocessed data from the Lakehouse.

[Screenshot: michaeltrilling_0-1762239762891.png]

This is the query that writes data back to the Lakehouse:

[Screenshot: michaeltrilling_1-1762239941395.png]

The rest of the queries pull Master Data from Excel files in SharePoint; this data is used in the transformation from the Preprocessed table into the output table:

[Screenshot: michaeltrilling_2-1762240086326.png]

Please let me know if you would like any further details.


Thanks,

Michael

Hi Vivien,

Your architecture description is accurate.  Here is the Pipeline:

[Screenshot: michaeltrilling_0-1762247593086.png]

Thanks so much for the support,

Michael

Thank you very much.

When you run the Pipeline, it will start by running MAtching_Tables / Actuals_Fils / AFIS_Projects / Planview Staffing Master Data / Planview Employees.

Once the Pipeline is launched, if you look at the refresh history for each of these dataflows right from the start, do you see the triggered run showing as in progress?

Can you also send a screenshot of how the DataFlow is called from the Data Pipeline activity?

Vivien

Hi Vivien,

Yes, when we run the Pipeline, either manually or on a schedule, the individual dataflows start running based on their dependencies. The Dataflows with success dependencies wait until all predecessors have successfully completed.  We have three retries configured on each Dataflow to cover spurious errors.  I am not clear on what new screenshot you are asking for, sorry.  Would you please explain more specifically?
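The execution model described above, activities gated on the success of their predecessors, each with a small retry budget, can be sketched in plain Python. This is purely illustrative of the ordering and retry semantics, not Fabric code; the activity names and callables are hypothetical:

```python
# Illustrative sketch of the run order described above: each Dataflow
# activity runs only after all of its "on success" predecessors, and is
# retried up to 3 times on a spurious failure.

def run_with_retries(activity, retries=3):
    """Call `activity` until it succeeds or the retry budget is spent."""
    last_error = None
    for _attempt in range(1 + retries):
        try:
            return activity()
        except Exception as err:  # spurious error: try again
            last_error = err
    raise RuntimeError(f"activity failed after {1 + retries} attempts") from last_error

def run_pipeline(activities, dependencies):
    """Run activities in an order that respects success dependencies.
    `dependencies[name]` lists the names that must complete first."""
    done, order = set(), []
    while len(done) < len(activities):
        progressed = False
        for name, activity in activities.items():
            if name in done:
                continue
            if all(dep in done for dep in dependencies.get(name, [])):
                run_with_retries(activity)
                done.add(name)
                order.append(name)
                progressed = True
        if not progressed:
            raise RuntimeError("dependency cycle detected")
    return order
```

One consequence worth noting: retries mask transient failures, so a Dataflow that wrote nothing on its first attempt but "succeeded" on a retry would still show green at the Pipeline level.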


Thanks,

Michael

For various reasons, just because the Data Pipeline is running correctly does not mean that it is triggering the underlying DataFlows correctly.

I wanted to highlight that you may see successful runs at the Pipeline level while, ultimately, the DataFlows are not actually executed.

Because once they are triggered, whether by the Pipeline or by a manual refresh, the subsequent processing is the same, so the end result should not differ.

Hi Vivien,

OK, now I understand, yes, not only does the Pipeline complete successfully, each DataFlow within the Pipeline finishes successfully and shows the Start and End Time of the run.

Very strange indeed,

Michael

So everything is running correctly, yet you can't see anything happening in the Lakehouse? As if it hadn't run?

And are you sure about the connection configuration? That the DataFlows are actually writing to the Lakehouse you're looking at, etc.?

If you're sure about all that, I don't see where the problem could be coming from.
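One way to settle the wrong-destination question raised above is to have the Dataflow stamp each run with a load timestamp column and compare the table's newest stamp against the Pipeline's start time. The sketch below is illustrative only; `rows` stands in for the result of querying the Lakehouse table you believe is the target, and the `load_ts` column name is a hypothetical convention:

```python
# Freshness check: did this table actually receive rows during this run?
# `rows` would come from querying the suspected destination table;
# `load_ts` is an assumed column the Dataflow stamps on every row.

from datetime import datetime, timedelta

def destination_is_fresh(rows, pipeline_started_at, ts_column="load_ts"):
    """True if at least one row was written at or after the run start,
    i.e. the Dataflow really wrote to this table during this run."""
    if not rows:
        return False
    latest = max(row[ts_column] for row in rows)
    return latest >= pipeline_started_at
```

If this check comes back stale even though the run reports success, the DataFlow is most likely writing to a different Lakehouse (or workspace) than the one being inspected.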

Hi @michaeltrilling,

Thank you for confirming that the issue is resolved. Please let us know if you need any further assistance from our end, we will be happy to address.

I don't believe the issue is resolved.  Did I hit the wrong button by mistake perhaps?

Hi @michaeltrilling,

Sorry for the confusion. Although everything appears to be running successfully, we are unable to accurately identify the root cause. Therefore, please raise a support ticket using the link below so that the backend engineer can assist you in resolving the issue.

Create a Fabric and Power BI Support Ticket - Power BI | Microsoft Learn

 

Thank you.

Hello,

So, if I understand correctly, there is a DataFlow, which contains numerous queries.

The data sources for these queries are either a lakehouse table or an Excel file, is that right?

Then, the destinations for all these queries in the DataFlow are Lakehouse tables only?

Is it also possible to have a screenshot of the Pipeline?

Have a nice day,

Vivien

vivien57
Super User

Hello,

In the DataFlow history, can you see the runs that are triggered with the Pipeline?

Have a nice day,

Vivien
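Refresh history can also be inspected programmatically. For Gen1 dataflows the Power BI REST API exposes a "Get Dataflow Transactions" endpoint whose response distinguishes on-demand from scheduled/triggered refreshes; whether it covers a Gen2 dataflow is an assumption you would need to verify. The sketch below only filters a transactions-style payload, with `payload` mimicking that response shape rather than calling the API:

```python
# Illustrative filter over a dataflow refresh-history payload shaped like
# the Power BI "Get Dataflow Transactions" response. The payload here is
# a local mock; in practice it would come from an authenticated API call.

def pipeline_triggered_runs(payload):
    """Return transactions not started by a manual (OnDemand) refresh,
    i.e. the runs that should have been triggered by the Pipeline."""
    return [t for t in payload.get("value", [])
            if t.get("refreshType") != "OnDemand"]
```

If the pipeline-triggered runs are missing from the history even though the Pipeline shows success, that would confirm the trigger is being lost before the DataFlow executes.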
