Hi all
I have a table that I need to duplicate several times, apply different filters and small bits of data manipulation to each copy, and then re-append them all into a final master data table.
The first round of this, where I duplicated the table 3 times to get 4 tables in total, seems to be working fine. I take a 2.4 MB source table, make the copies, do some manipulation, and append them all back together, and the result comes out to just under 10 MB. All good. I need to break the process into two stages because the fifth table I'm about to create has formulas that need this overall appended table to be fully complete.
When I add the fifth table and "close and apply", the table balloons to multiple GB in size (8 GB at the time of writing, and it's still growing); this affects both the fifth duplicated table and the new master table.
Is there some common mistake I might be making that's causing this? At no point did my formulas produce a "Table" value in a column that I had to expand; I've literally just done a few lookups, some replace-value steps, and custom if...then columns in the fifth table. I just don't understand why the size of the table is going so high...
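For context, the duplicate-filter-append pattern described above can be sketched in Power Query M roughly as follows; the table name, filter conditions, and step names here are placeholders, not the actual queries:

```m
// Hypothetical sketch of the duplicate-filter-append pattern.
// "BaseTable" and the filter conditions are placeholders.
let
    Source = BaseTable,
    // Each "copy" is just a filtered/transformed reference to the same source
    CopyA = Table.SelectRows(Source, each [ASSESSMENT_MAIN] = "YES"),
    CopyB = Table.SelectRows(Source, each [ASSESSMENT_MAIN] = "NO"),
    // Re-append the variants into one master table
    Master = Table.Combine({CopyA, CopyB})
in
    Master
```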
Solved! Go to Solution.
Found it - it was the List.PositionOf lines. I replaced them with standard table merges and suddenly the table size has gone from 9.2GB to 40MB and the refresh is quick again.
The theory behind it is discussed a bit in this post and this is what led me to those lines being the problem:
Solved: List.PositionOf > performance than Merge Queries? - Microsoft Fabric Community
Thanks again @Ritaf1983 for getting me thinking in the right direction, even if I never gave you a chance to identify the specific problem before I found it on my own.
Thanks @Ritaf1983
Here is the code for the extra steps I'm doing to the fifth table that aren't in what I did for the other four:
#"Filtered Rows" = Table.SelectRows(#"Expanded SSLookup", each ([ASSESSMENT_MAIN] = "YES") and ([ASSESSMENT_SPLIT_SUBJECT] = "MATH" or [ASSESSMENT_SPLIT_SUBJECT] = "READ" or [ASSESSMENT_SPLIT_SUBJECT] = "WRIT")),
#"Replaced Value7" = Table.ReplaceValue(#"Filtered Rows","READ","COMB",Replacer.ReplaceText,{"ASSESSMENT_SPLIT_SUBJECT"}),
#"Replaced Value8" = Table.ReplaceValue(#"Replaced Value7","WRIT","COMB",Replacer.ReplaceText,{"ASSESSMENT_SPLIT_SUBJECT"}),
#"Replaced Value9" = Table.ReplaceValue(#"Replaced Value8","MATH","COMB",Replacer.ReplaceText,{"ASSESSMENT_SPLIT_SUBJECT"}),
#"Removed Columns1" = Table.RemoveColumns(#"Replaced Value9",{"STAFF_UNIQUE_ID", "ASSESSMENT_CODE_ASSESSMENTNAME", "GPS_MAX", "GRADE_NAME", "NON_SUBMISSION_REASON", "STAFF_ID", "ASSESSMENT_SPLIT_ASSESSTYPE", "ASSESSMENT_SPLIT_BOUNDARYLOW", "ASSESSMENT_SPLIT_BOUNDARYHIGH", "ASSESSMENT_GRADE_FINAL", "ASSESSMENT_MAIN", "SUBTYPESCORE", "ASSESSMENT_SS", "PERIOD_START", "PERIOD_END", "GRADE_SET_CODE", "GRADE_POINTS", "GPS_MIN", "ASSESSMENT_CODE_TESTNAME", "SUMMATIVE_ASSESSMENT_MARK_ID", "SUMMATIVE_ASSESSMENT_MARK_UNIQUE_ID", "ASSESSMENT_ID", "ASSESSMENT_UNIQUE_ID", "SUBJECT_CODE", "SUBJECT_NAME", "ASSESSMENT_CODE", "ASSESSMENT_NAME", "ASSESSMENT_DATE"}),
#"Removed Duplicates" = Table.Distinct(#"Removed Columns1"),
#"Added Custom2" = Table.AddColumn(#"Removed Duplicates", "COMB_READ_LOOKUP", each [STUDENT_UNIQUE_ID]&[ASSESSMENT_SPLIT_YEAR]&[ASSESSMENT_SPLIT_DC]&"READYES"),
#"Added Custom4" = Table.AddColumn(#"Added Custom2", "COMB_WRIT_LOOKUP", each [STUDENT_UNIQUE_ID]&[ASSESSMENT_SPLIT_YEAR]&[ASSESSMENT_SPLIT_DC]&"WRITYES"),
#"Added Custom5" = Table.AddColumn(#"Added Custom4", "COMB_MATH_LOOKUP", each [STUDENT_UNIQUE_ID]&[ASSESSMENT_SPLIT_YEAR]&[ASSESSMENT_SPLIT_DC]&"MATHYES"),
// Per-row lookups: List.PositionOf scans the whole COMB_LOOKUP list for every row
#"Added Custom6" = Table.AddColumn(#"Added Custom5", "READ_GRADE", each try SummativeAssessmentWIP2[ASSESSMENT_GRADE_FINAL]{List.PositionOf(SummativeAssessmentWIP2[COMB_LOOKUP],[COMB_READ_LOOKUP],0)} otherwise null),
#"Added Custom7" = Table.AddColumn(#"Added Custom6", "WRIT_GRADE", each try SummativeAssessmentWIP2[ASSESSMENT_GRADE_FINAL]{List.PositionOf(SummativeAssessmentWIP2[COMB_LOOKUP],[COMB_WRIT_LOOKUP],0)} otherwise null),
#"Added Custom8" = Table.AddColumn(#"Added Custom7", "MATH_GRADE", each try SummativeAssessmentWIP2[ASSESSMENT_GRADE_FINAL]{List.PositionOf(SummativeAssessmentWIP2[COMB_LOOKUP],[COMB_MATH_LOOKUP],0)} otherwise null),
#"Added Custom9" = Table.AddColumn(#"Added Custom8", "ASSESSMENT_SPLIT_ASSESSTYPE", each "C"),
#"Added Custom10" = Table.AddColumn(#"Added Custom9", "ASSESSMENT_GRADE_FINAL", each if [READ_GRADE] = null or [WRIT_GRADE] = null or [MATH_GRADE] = null then null else if [READ_GRADE] = "WB" or [WRIT_GRADE] = "WB" or [MATH_GRADE] = "WB" then "WB" else if [READ_GRADE] = "WT" or [WRIT_GRADE] = "WT" or [MATH_GRADE] = "WT" then "WT" else if [READ_GRADE] = "EXP" or [WRIT_GRADE] = "EXP" or [MATH_GRADE] = "EXP" then "EXP" else "GD"),
#"Added Custom11" = Table.AddColumn(#"Added Custom10", "COMB_COMBINATION_EXP", each (if [READ_GRADE] = "EXP" or [READ_GRADE] = "GD" then "R" else "") & (if [WRIT_GRADE] = "EXP" or [WRIT_GRADE] = "GD" then "W" else "") & (if [MATH_GRADE] = "EXP" or [MATH_GRADE] = "GD" then "M" else "")),
#"Added Custom12" = Table.AddColumn(#"Added Custom11", "COMB_COMBINATION_GD", each (if [READ_GRADE] = "GD" then "R" else "") & (if [WRIT_GRADE] = "GD" then "W" else "") & (if [MATH_GRADE] = "GD" then "M" else "")),
#"Removed Columns2" = Table.RemoveColumns(#"Added Custom12",{"COMB_READ_LOOKUP", "COMB_WRIT_LOOKUP", "COMB_MATH_LOOKUP", "READ_GRADE", "WRIT_GRADE", "MATH_GRADE"})
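As a hedged sketch of the merge-based rewrite mentioned in the accepted solution (step names here are illustrative, and it assumes SummativeAssessmentWIP2[COMB_LOOKUP] exists as shown above), one of the three List.PositionOf lookups could be expressed as a standard merge like this:

```m
// Sketch: replacing the READ_GRADE List.PositionOf lookup with a merge.
// A merge is evaluated as a single join rather than one full list scan per row.
#"Merged READ" = Table.NestedJoin(
    #"Added Custom5", {"COMB_READ_LOOKUP"},
    SummativeAssessmentWIP2, {"COMB_LOOKUP"},
    "ReadMatch", JoinKind.LeftOuter
),
#"Expanded READ" = Table.ExpandTableColumn(
    #"Merged READ", "ReadMatch",
    {"ASSESSMENT_GRADE_FINAL"}, {"READ_GRADE"}
),
```

One caveat: List.PositionOf returns only the first match, while a left-outer merge returns every match, so if COMB_LOOKUP is not unique the merge can add rows; running Table.Distinct over the lookup columns first keeps the two approaches equivalent.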
The most demanding things I can see in that list are a dedupe (down from roughly 4,200 rows to 900 rows) and a few lookups using List.PositionOf; everything else is standard if...then logic within the table with very basic conditions and results.
The table that comes out the other side also makes sense in terms of row count (about 20,000 rows), and looking through it I can't see anything that would cause it to balloon to 9.2 GB (that's where it stopped in the end last night). I did use a table merge in the earlier 4-into-1 consolidation, but that only brings in a single column, and that mid-way calculation table shows as only 10 MB and completes quickly.
Edit: I have done a full PC restart. The .pbix file size for the model before and after these additions has barely changed (189 MB to 190 MB); the main table I'm appending onto refreshes at just under 10 MB with 19,500 rows, while the COMB table (the one with the code above) and the final master table (the two tables appended together, with no other code) are 9.2 GB, at 900 and 20,400 rows respectively. Just in case that helps narrow it down at all...
It’s quite surprising that such small tables could result in an 8 GB size, even if a Cartesian multiplication were involved (which would come from a merge rather than an append).
Without seeing the file and understanding the exact steps, it’s hard to provide a definitive answer. I’d recommend reviewing each step in the process to identify where this dramatic size increase is happening.
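One general pattern worth checking in cases like this (purely illustrative, not a diagnosis of this specific file): a merge on a non-unique key silently fans out rows, and the blow-up compounds across later steps. Table and column names below are placeholders:

```m
// Illustrative only: a merge on a non-unique key multiplies rows.
// If Left has 900 rows and each key matches 10 rows in Right,
// the expanded result has roughly 9,000 rows, not 900.
Joined = Table.NestedJoin(Left, {"Key"}, Right, {"Key"}, "Match", JoinKind.LeftOuter),
Expanded = Table.ExpandTableColumn(Joined, "Match", {"Value"})
```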
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.