I am getting the below error during a scheduled refresh. I am only using Dataflow Gen2 and we have Premium capacity. Why am I getting this, and how can I resolve it? Thanks
"Mashup Exception Data Format Error, Error Details: Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: DataFormat.Error: Failed to insert a table., Underlying error: Parquet: class parquet::ParquetStatusException (message: 'Out of memory: malloc of size 33561792 failed') Details: Reason = DataFormat.Error;ErrorCode = Lakehouse036;Message = Parquet: class parquet::ParquetStatusException (message: 'Out of memory: malloc of size 33561792 failed');Message.Format = Parquet: class parquet::ParquetStatusException (message: 'Out of memory: malloc of size 33561792"
Hi @Sreejisha,
Thanks for reaching out to the Microsoft Fabric community forum.
Based on the error message you've shared, the refresh is failing due to an out-of-memory condition while reading a Parquet file during execution of your Dataflow Gen2. Specifically, the Parquet processing engine attempted to allocate a block of memory (around 33 MB) but was unable to do so. This typically points to a scenario where a large portion of data, such as a big column, a row group, or a wide table, is being loaded into memory and exceeds the job's memory allocation.
Even though your workspace is on Premium capacity, resources (like memory) for each refresh job are still governed by internal limits. This means that certain complex or large datasets can run into memory issues, especially when working with large Parquet files or when transformations require loading large segments into memory.
Here are a few things you can try to solve your issue:
* If you're generating the Parquet files yourself, consider splitting them into smaller files or using smaller row group sizes to reduce the memory load during processing.
* Also try to reduce the number of columns or apply filters earlier in your query steps to minimize the volume of data processed.
* If the query logic is complex, you can try breaking the dataflow into smaller steps using intermediary dataflows.
* If possible, review the Parquet files using a notebook or other tool to inspect for unusually large columns, nested structures, or large binary fields.
If the issue continues despite these changes, you can raise a support ticket with Microsoft Support. Please refer to the link below on how to create one.
How to create a Fabric and Power BI Support ticket - Power BI | Microsoft Learn
I would also like to take a moment to thank @DataGuru412 for actively participating in the community forum and for the solutions you've been sharing. Your contributions make a real difference.
If I have misunderstood your needs or you still have problems, please feel free to let us know.
Best Regards,
Hammad.
Community Support Team
If this post helps, then please mark it as a solution so that other members can find it more quickly.
Thank you.
@DataGuru412 I don't think it is a similar issue, as I am doing a scheduled refresh that normally works every time, and the message says 'Out of memory: malloc of size 33561792 failed'. I believe Fabric can handle more than that.
Hi,
It seems your problem is similar to the one in the link below.
Please give it a try.
Thanks
RJ