AhmedAlShaikh
Frequent Visitor

Evaluation ran out of memory Error

I'm trying to publish a query in Dataflow Gen2 to a destination in the lakehouse (a certain table), but after a long wait I get this error:
"Evaluation ran out of memory"

 

Are there any solutions other than breaking the results into smaller pieces? I have to admit the data is huge, and I'm using a "pivot" operation in the last step.

6 REPLIES
v-gchenna-msft
Community Support

Hi @AhmedAlShaikh ,

We haven't heard back from you on the last response and wanted to check whether you have found a resolution yet.
If you have, please share it with the community, as it can be helpful to others.
Otherwise, reply with more details and we will try to help.


miguel
Community Admin


@AhmedAlShaikh wrote:

I'm trying to publish a query in Dataflow Gen2 to a destination in the lakehouse (a certain table), but after a long wait I get this error:
"Evaluation ran out of memory"

 

Are there any solutions other than breaking the results into smaller pieces? I have to admit the data is huge, and I'm using a "pivot" operation in the last step.


Try staging the data right before the pivot operation, then create a reference query that only performs the pivot and loads the data to your desired destination. That should help tremendously.
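The split described above can be sketched as two queries. The query name ("StagedData") and the List.Sum aggregation are illustrative, not from this thread:

```
// Query 1 — "StagedData": all transformations up to (but not including) the pivot.
// In the Dataflow Gen2 editor, turn ON "Enable staging" for this query
// and do not set a data destination on it.
let
    Source = ...,  // your existing source and preparation steps
    Prepared = Table.SelectRows(Source, each [Value] <> null)
in
    Prepared

// Query 2 — a reference query (right-click StagedData > Reference):
// it only pivots and loads to the lakehouse destination.
let
    Source = StagedData,
    Pivoted = Table.Pivot(Source, List.Distinct(Source[Name]), "Name", "Value", List.Sum)
in
    Pivoted
```

Because Query 1 is staged, its result is materialized once, and the memory-hungry pivot in Query 2 reads from that materialized copy instead of re-evaluating the whole pipeline.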

I have tried staging the data before pivoting, as shown in the code below, but now I'm getting this error:

[Attached screenshot: AhmedAlShaikh_0-1702499220781.png]

 



let
    Source = SharePoint.Files("xxx", [ApiVersion = 15]),
    #"Filtered Rows PATH" = Table.SelectRows(Source, each Text.Contains([Folder Path], "yyy")),
    #"Removed Other Columns" = Table.SelectColumns(#"Filtered Rows PATH",{"Content", "Name"}),
    #"Replaced Value1" = Table.ReplaceValue(#"Removed Other Columns",".csv","",Replacer.ReplaceText,{"Name"}),
    #"Split Column by Delimiter" = Table.SplitColumn(#"Replaced Value1", "Name", Splitter.SplitTextByEachDelimiter({" - "}, QuoteStyle.None, true), {"Name.1", "Name.2"}),
    #"Renamed Columns" = Table.RenameColumns(#"Split Column by Delimiter",{{"Name.1", "Source"}, {"Name.2", "Property"}}),
    #"Added Custom" = Table.AddColumn(#"Renamed Columns", "Custom", each Table.PromoteHeaders(Csv.Document([Content],[Delimiter=",", Columns=4, Encoding=1252, QuoteStyle=QuoteStyle.None]))),
    #"Expanded Custom" = Table.ExpandTableColumn(#"Added Custom", "Custom", {"Month", "Hour", "Name", "Value"}),
    #"Removed Columns" = Table.RemoveColumns(#"Expanded Custom",{"Content"}),
    #"Filtered Rows" = Table.SelectRows(#"Removed Columns", each ([Value] <> null)),
    #"Changed Type" = Table.TransformColumnTypes(#"Filtered Rows",{{"Month", type date}, {"Hour", type time}, {"Name", type text}, {"Value", type number}}),
    #"stagingTable" = Table.Buffer(#"Changed Type")
in
    stagingTable

 

Buffering the table with Table.Buffer is not the same as staging. Staging is a new mechanism available only in Dataflows Gen2; you can learn more about it from the link below:

https://blog.fabric.microsoft.com/blog/data-factory-spotlight-dataflows-gen2/

 

Using Table.Buffer (and other buffering functions) can lead to high memory consumption and to the out-of-memory error you're seeing.
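Concretely, for the query posted above, that means dropping the final Table.Buffer step entirely and letting staging do the materialization instead. A sketch of how the query would end (earlier steps unchanged):

```
    // ... earlier steps unchanged ...
    #"Changed Type" = Table.TransformColumnTypes(#"Filtered Rows",
        {{"Month", type date}, {"Hour", type time}, {"Name", type text}, {"Value", type number}})
    // No Table.Buffer step: turn ON "Enable staging" for this query in the
    // Dataflow Gen2 editor instead, so the result is written to staging storage
    // rather than held in evaluation memory.
in
    #"Changed Type"
```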

v-gchenna-msft
Community Support

Hi @AhmedAlShaikh ,

Thanks for using Fabric Community.

As I understand it, you are facing the error "Evaluation ran out of memory" while working with a large dataset in Dataflow Gen2.

To handle the error, you can try the optimization techniques below:

  • Column selection: Analyze your table schema and limit the columns queried and pivoted to only those absolutely needed. Reducing data volume can significantly decrease memory consumption.
  • Filtering: Apply adequate filters before the "pivot" step to exclude irrelevant data from being processed.
  • Caching: Consider caching intermediate results if possible. This can reduce repeated calculations and save memory.
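The first two points can be sketched against the query in this thread. The column list comes from the posted query; the date cutoff is a made-up example of a filter:

```
let
    Source = ...,  // output of the earlier load/expand steps
    // Column selection: keep only the columns the pivot actually needs
    KeptColumns = Table.SelectColumns(Source, {"Month", "Name", "Value"}),
    // Filtering: drop irrelevant rows before they reach the pivot
    Filtered = Table.SelectRows(KeptColumns,
        each [Value] <> null and [Month] >= #date(2023, 1, 1)),
    Pivoted = Table.Pivot(Filtered, List.Distinct(Filtered[Name]), "Name", "Value", List.Sum)
in
    Pivoted
```

The pivot's memory cost grows with both row count and the number of distinct values in the pivoted column, so trimming columns and rows upstream directly shrinks the working set.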

Hope this is helpful. Please let me know if you have further queries.
