Anonymous
Not applicable

Error: PipelineException: Container exited unexpectedly with code 0x0000DEAD

Hi,

I am getting the above-mentioned error while refreshing a dataflow. I downloaded the log, and it reports a pipeline execution error in one of the datasets.

 

Thanks in advance

3 REPLIES
Anonymous
Not applicable

Hi 

 

I keep getting this message when I try to run an ML model. I don't know in which column of my dataset this is occurring, or why. All my data is loaded correctly with no missing values.

 

Error: PipelineException: We couldn't convert to Number. . RootActivityId = 26f1f32a-601a-47a9-980c-1fd0627d14eb.Param1 = PipelineException: We couldn't convert to Number. Request ID: 94185a3c-47bd-dc64-f70a-13c843930de8.
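Since the error message doesn't name the failing column, one way to narrow it down is to attempt the numeric conversion defensively on an exported copy of the data. A minimal sketch in Python with pandas (the DataFrame and column names here are placeholders, not from the original post):

```python
import pandas as pd

# Placeholder data standing in for an exported copy of the dataset;
# the column names are illustrative, not from the original post.
df = pd.DataFrame({
    "amount": ["12.5", "abc", "7"],
    "qty": ["1", "2", "3"],
})

# Coerce each column to numeric: values that fail become NaN.
# A value that was non-null before coercion but NaN after is a culprit.
bad_values = {}
for col in df.columns:
    coerced = pd.to_numeric(df[col], errors="coerce")
    bad = df[col][coerced.isna() & df[col].notna()]
    if not bad.empty:
        bad_values[col] = list(bad)

print(bad_values)  # prints {'amount': ['abc']}
```

The same idea works inside Power Query with `try Number.From(...) otherwise null` on each suspect column, which surfaces the bad rows without failing the whole refresh.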

anatole
Regular Visitor

Hi @v-robertq-msft, where do we set max memory and container size for dataflows? I don't see such options in the dataflow settings page.

v-robertq-msft
Community Support

Hi, 

Based on your error description, you can try the suggestions below:

With respect to the compute engine, there is a semantic difference between referencing entities from a dataflow in the same workspace and from a dataflow in a different workspace. In the same-workspace case, the dataflows hold strong references to each other and are updated in the same transaction, so we do not need to cache data and can read directly from the upstream entity. When they come from different workspaces, however, the references are weak, and to stay self-contained within a workspace we need to re-cache the data. That re-caching step is what adds the extra processing time.

To mitigate this, we suggest:

  1. Increase the Dataflows max memory % and decrease the Datasets max memory % by the same amount. (If they are at the default settings, increase Dataflows to 40% and decrease Datasets to 80%.)
  2. Increase the Dataflows container size to 1,500 MB. This reduces parallelism and can reduce intermittent failures.
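These settings live in the Premium capacity admin settings rather than on the dataflow itself. As a hedged sketch, the Dataflows workload max memory % can also be adjusted through the Power BI REST API's "Capacities - Patch Workload" endpoint (the capacity ID and token below are placeholders; container size, as far as I know, is set in the capacity admin portal UI and is not part of this request body):

```python
import json
import urllib.request

# Sketch of the "Capacities - Patch Workload" REST call that adjusts a
# Premium capacity's Dataflows workload max memory %. Both values below
# are placeholders; supply real ones before sending.
CAPACITY_ID = "00000000-0000-0000-0000-000000000000"  # placeholder
ACCESS_TOKEN = None  # supply a real AAD access token to actually send

url = (f"https://api.powerbi.com/v1.0/myorg/capacities/"
       f"{CAPACITY_ID}/Workloads/Dataflows")
payload = {"state": "Enabled", "maxMemoryPercentageSetByUser": 40}

req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode(),
    method="PATCH",
    headers={"Content-Type": "application/json",
             "Authorization": f"Bearer {ACCESS_TOKEN}"},
)

if ACCESS_TOKEN:  # only send when a real token is configured
    urllib.request.urlopen(req)
```

Calling this requires capacity admin rights; without a token the snippet only builds the request so you can inspect the URL and payload.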

 

Here’s a blog on troubleshooting dataflow refresh failures that you can check:

https://community.powerbi.com/t5/Community-Blog/Why-is-my-Power-BI-dataflow-refresh-failing/ba-p/623...

 

Thank you very much!

 

Best Regards,

Community Support Team _Robert Qin

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
