Anonymous
Not applicable

Evaluation ran out of memory Error

I'm trying to publish a query in Dataflow Gen2 to a destination in the lakehouse (a specific table), but I'm getting this error after a long wait:
"Evaluation ran out of memory"

 

Are there any solutions other than breaking the results into smaller chunks? I have to admit that the data is huge, and I'm using a "pivot" operation in the last step.

Replies
Anonymous
Not applicable

Hi @Anonymous ,

We haven't heard back from you on the last response and were just checking to see whether you have found a resolution yet.
If you have, please share it with the community, as it may help others.
Otherwise, please reply with more details and we will try to help.


miguel
Community Admin


@Anonymous wrote:

I'm trying to publish a query in Dataflow Gen2 to a destination in the lakehouse (a specific table), but I'm getting this error after a long wait:
"Evaluation ran out of memory"

 

Are there any solutions other than breaking the results into smaller chunks? I have to admit that the data is huge, and I'm using a "pivot" operation in the last step.


Try staging the data right before the pivot operation, then create a reference query that only does the pivot and loads the data to your desired destination. That should help tremendously.
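
For example, here is a minimal sketch of that split. The query names, columns, and sample rows are placeholders for illustration, not taken from your dataflow:

// Query 1 - "StagedSource": every transformation step except the pivot.
// Turn on "Enable staging" for this query so its result is persisted before the pivot runs.
let
    Source = #table(
        type table [Month = date, Name = text, Value = number],
        {
            {#date(2024, 1, 1), "Temperature", 21.5},
            {#date(2024, 1, 1), "Load", 80}
        }
    )
in
    Source

// Query 2 - "PivotedOutput": references the staged query, only does the pivot,
// and has the lakehouse table set as its data destination.
let
    Source = StagedSource,
    Pivoted = Table.Pivot(Source, List.Distinct(Source[Name]), "Name", "Value", List.Sum)
in
    Pivoted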

Anonymous
Not applicable

I have tried staging the data before pivoting, as shown in the code below, but now I'm getting this error:

(screenshot of the error message)

 



let
    Source = SharePoint.Files("xxx", [ApiVersion = 15]),
    #"Filtered Rows PATH" = Table.SelectRows(Source, each Text.Contains([Folder Path], "yyy")),
    #"Removed Other Columns" = Table.SelectColumns(#"Filtered Rows PATH", {"Content", "Name"}),
    #"Replaced Value1" = Table.ReplaceValue(#"Removed Other Columns", ".csv", "", Replacer.ReplaceText, {"Name"}),
    #"Split Column by Delimiter" = Table.SplitColumn(#"Replaced Value1", "Name", Splitter.SplitTextByEachDelimiter({" - "}, QuoteStyle.None, true), {"Name.1", "Name.2"}),
    #"Renamed Columns" = Table.RenameColumns(#"Split Column by Delimiter", {{"Name.1", "Source"}, {"Name.2", "Property"}}),
    #"Added Custom" = Table.AddColumn(#"Renamed Columns", "Custom", each Table.PromoteHeaders(Csv.Document([Content], [Delimiter = ",", Columns = 4, Encoding = 1252, QuoteStyle = QuoteStyle.None]))),
    #"Expanded Custom" = Table.ExpandTableColumn(#"Added Custom", "Custom", {"Month", "Hour", "Name", "Value"}),
    #"Removed Columns" = Table.RemoveColumns(#"Expanded Custom", {"Content"}),
    #"Filtered Rows" = Table.SelectRows(#"Removed Columns", each ([Value] <> null)),
    #"Changed Type" = Table.TransformColumnTypes(#"Filtered Rows", {{"Month", type date}, {"Hour", type time}, {"Name", type text}, {"Value", type number}}),
    #"stagingTable" = Table.Buffer(#"Changed Type")
in
    #"stagingTable"

 

Buffering the table using Table.Buffer is not the same as staging. Staging is a new mechanism only available in Dataflows Gen2. You can learn more about it from the link below:

https://blog.fabric.microsoft.com/blog/data-factory-spotlight-dataflows-gen2/

 

Using Table.Buffer (and other buffer functions) could lead to high memory consumption and the out of memory error that you're seeing.
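
In the query you posted, that means dropping the buffering step so the query simply ends on the typed table, with staging enabled on the query itself instead. A sketch of just the tail of your query, unchanged apart from removing Table.Buffer:

    // ... earlier steps unchanged ...
    #"Changed Type" = Table.TransformColumnTypes(#"Filtered Rows", {{"Month", type date}, {"Hour", type time}, {"Name", type text}, {"Value", type number}})
    // No Table.Buffer step here; enable staging on this query instead,
    // then do the pivot in a separate query that references it.
in
    #"Changed Type"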

Anonymous
Not applicable

Hi @Anonymous ,

Thanks for using Fabric Community.

As I understand it, you are facing the error "Evaluation ran out of memory" while working with huge data in Dataflow Gen2.

To handle the error, you can try the optimization techniques below (a small sketch follows the list):

  • Column selection: Analyze your table schema and limit the columns queried and pivoted to only those absolutely needed. Reducing data volume can significantly decrease memory consumption.
  • Filtering: Apply adequate filters before the "pivot" step to exclude irrelevant data from being processed.
  • Caching: Consider caching intermediate results if possible. This can reduce repeated calculations and save memory.
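
For illustration, here is a small sketch that applies the first two points before a pivot. The column names follow the schema in the query above, and StagedSource is a placeholder for a staged or referenced query:

let
    // Placeholder reference to an earlier (ideally staged) query.
    Source = StagedSource,
    // Column selection: keep only the columns the pivot actually needs.
    KeptColumns = Table.SelectColumns(Source, {"Month", "Name", "Value"}),
    // Filtering: drop rows that would only inflate the pivot.
    NonNullRows = Table.SelectRows(KeptColumns, each [Value] <> null),
    Pivoted = Table.Pivot(NonNullRows, List.Distinct(NonNullRows[Name]), "Name", "Value", List.Sum)
in
    Pivoted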

Hope this is helpful. Please let me know if you have further queries.
