Hello,
I am trying to append 30 tables (20-100 MB each) in Dataflows, and I am getting this error: "Evaluation resulted in a stack overflow and cannot continue."
The tables are all similar, originating from JSON files.
They all look like this:
let
    Source = SharePoint.Files(SharepointSource, [ApiVersion = 15]),
    importedfile = Source{[Name = "FILE_NAME.Json", #"Folder Path" = SharepointSource & SharepointFolder]}[Content],
    importedJSON = Json.Document(importedfile),
    #"Converted to Table" = Table.FromList(importedJSON, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    // Convert the first record to a table to extract column names
    FirstRecord = #"Converted to Table"{0}[Column1],
    FirstRecordTable = Table.FromRecords({FirstRecord}),
    ColumnNames = Table.ColumnNames(FirstRecordTable),
    // Expand the table using dynamic column names
    #"Expanded Table" = Table.ExpandRecordColumn(#"Converted to Table", "Column1", ColumnNames, ColumnNames)
in
    #"Expanded Table"
and the append is this:
let
    Source = Table.Combine({TABLE1, TABLE2, ETC...}),
    TRANSFORMATIONS,
    TRANSFORMATIONS,
    TRANSFORMATIONS
in
    RESULT
Solutions I've tried:
- Buffering each table before appending (see the sketch below)
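Roughly, the buffered version of the append looked like this (same placeholder table names as above; a sketch of the approach rather than my exact query):

let
    // Materialize each source table in memory first, so the combine step
    // doesn't have to re-evaluate each source query
    Buffered = List.Transform({TABLE1, TABLE2, TABLE3}, each Table.Buffer(_)),
    Source = Table.Combine(Buffered)
in
    Source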
During the evaluation, the Dataflow's virtual memory is at 6 GB. Does that mean Dataflows can't combine 30 tables totaling 1.5 GB? That would be a huge deal breaker.
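In the meantime, one workaround I'm considering is splitting the append into smaller batches so that no single Table.Combine call takes all 30 tables at once. Again placeholder names, and I haven't verified that this actually avoids the stack overflow:

let
    // Combine in batches of ~10 tables, then combine the batch results
    BatchA = Table.Combine({TABLE1, TABLE2, TABLE3}),
    BatchB = Table.Combine({TABLE4, TABLE5, TABLE6}),
    Combined = Table.Combine({BatchA, BatchB})
in
    Combined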