Meister_Kaio
Frequent Visitor

Dataflow Gen2 publish fails, but still loads parquet files to destination lakehouse

Hello!

I am attempting to replicate a medallion architecture in Fabric using Dataflows and Lakehouses.

Initially, my first Dataflow executed flawlessly and generated tables in the bronze lakehouse. However, my current objective is to transfer these tables from the bronze lakehouse to the silver lakehouse.
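
For reference, here is a minimal sketch of the same bronze-to-silver copy done in a Fabric notebook with PySpark instead of a dataflow; the lakehouse names (bronze_lh, silver_lh) and the table list are placeholders, and this assumes both lakehouses are attached to the notebook:

# Copy each bronze Delta table into the silver lakehouse as a managed table.
# "spark" is the session Fabric notebooks provide by default; names are placeholders.
table_names = ["customers", "orders", "products"]  # placeholder table list

for name in table_names:
    df = spark.read.table(f"bronze_lh.{name}")    # read the bronze table
    (df.write
       .mode("overwrite")                         # replace any existing silver table
       .format("delta")                           # write a Delta table, not loose parquet files
       .saveAsTable(f"silver_lh.{name}"))         # register it in the silver lakehouse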

 

After publishing the flow, I encountered an issue where three out of six queries within the flow failed, resulting in an error message.

 

Upon inspecting my refreshed silver lakehouse, I discovered that all six tables from the bronze lakehouse had been loaded into the silver lakehouse as parquet files. Normally, the flow loads tables, not parquet files, into my lakehouse.

I am now curious why the flow proceeded to load the files even though it failed to publish correctly.

Thanks!

8 REPLIES
st_0999
Helper II

I had exactly the same error. 

I removed Enable Staging from all tables, but when I tried loading the last table to a Lakehouse, I got this error:

 

MashupException.Error: DataSource.Error: Error in replacing table's content with new data in a version: #{0}. Details: Message = We can't insert null data into a non-nullable column.;Message.Format = We can't insert null data into a non-nullable column.

 

I've specified the data type for all columns, but a few columns do contain null values (on purpose). I would expect nulls to still be allowed; otherwise that's very strange. Is this what's causing the error, and is there any way around it?
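
In case it helps with the investigation, here is how one could check from a notebook which columns actually contain nulls before the dataflow writes to the destination; a minimal PySpark sketch, where bronze_lh.my_table is a placeholder for the source table:

from pyspark.sql import functions as F

# Count the nulls in every column of the source table so the offending
# column(s) can be compared against the destination table's schema.
df = spark.read.table("bronze_lh.my_table")       # placeholder source table
df.select(
    [F.count(F.when(F.col(c).isNull(), 1)).alias(c) for c in df.columns]
).show()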

SinghIsKing
Frequent Visitor

I am also facing this issue. The dataflow fails, but files are generated in the Lakehouse.

miguel
Community Admin

Dataflows with an output destination create tables in a lakehouse, not raw files.

 

Are you able to repro this behavior in a new workspace? If yes, could you please share the steps to repro the scenario?

Meister_Kaio
Frequent Visitor

Small Update:
Upon refreshing the silver lakehouse, I observed that all the files had been converted to tables. The queries that failed in the dataflow produced empty tables, while the queries that succeeded display their contents.
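
To double-check which of the six silver tables actually ended up empty, a short notebook cell works; a sketch assuming the silver lakehouse is attached as silver_lh:

# Print the row count of every table in the silver lakehouse
# (the lakehouse name is a placeholder).
for t in spark.catalog.listTables("silver_lh"):
    rows = spark.read.table(f"silver_lh.{t.name}").count()
    print(t.name, rows)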

 

 

miguel
Community Admin

Hey! Could you please share the error messages from the refresh history of the dataflow in question that tries to load the data to your silver lakehouse?

The error from the dataflow:
Mashup Exception Data Source Error Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: DataSource.Error: Microsoft SQL: A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - An existing connection was forcibly closed by the remote host.) Details: DataSourceKind = Lakehouse;DataSourcePath = Lakehouse;Message = A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - An existing connection was forcibly closed by the remote host.);ErrorCode = -2146232060;Number = 10054;Class = 20

I encountered the same issue with other dataflows that had a different error message:


Mashup Exception Expression Error Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: Expression.Error: The column "[columnName]" from the table "[tableName]" has not been found. Details: "[columnName]" 

I have already found solutions to the errors mentioned above. However, I believe my issue is unrelated to them, as it has occurred with several different error messages.

Are you using a gateway? If yes, please check out this article and let us know if it helps:

On-premises data gateway considerations for output destinations in Dataflow Gen2 - Microsoft Fabric ...

I'm not using a gateway.
My workaround is to disable staging. It seems to work fine if I disable staging for all queries except one.
