I have a report utilizing data import from a persisted table in Databricks. Once the dataset size increased, I received the following error:
Total size of serialized results of 17 tasks (4.1 GB) is bigger than spark.driver.maxResultSize
Looking up the error, I found a lot of Spark-specific posts explaining that spark.driver.maxResultSize is a setting that exists to prevent out-of-memory exceptions on the driver. The reason I'm posting in a Power BI forum is that I haven't had any issue interacting with the data on the Databricks side (either in munging the data or writing it to Hive).
Does anybody know how the refresh interacts with Spark/Databricks and why it could be causing the issue in this particular situation? I would prefer to have some understanding of why it's occurring before I adjust the maxResultSize variable (possibly several times).
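To illustrate the distinction, here is a minimal sketch of the same failure mode from the Databricks side (assuming an existing SparkSession named spark; the table name sales_history is hypothetical). Distributed work leaves the rows on the executors, while anything that collects results onto the driver is capped by spark.driver.maxResultSize, and an import-mode refresh can pull the full result set back through that same driver path:

    from pyspark.sql import SparkSession

    # In a Databricks notebook `spark` already exists; built here so the
    # sketch is self-contained. The table name is hypothetical.
    spark = SparkSession.builder.getOrCreate()
    df = spark.table("sales_history")

    # Distributed operations (transformations, writes to Hive) keep the
    # rows on the executors, so they are not bounded by the limit:
    df.write.mode("overwrite").saveAsTable("sales_history_copy")

    # Collecting to the driver is what spark.driver.maxResultSize guards;
    # toPandas()/collect() trip it the same way a full-table fetch during
    # an import refresh can:
    pdf = df.toPandas()  # fails once serialized results exceed the cap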
Hi @jabate ,
I think this issue is more likely related to database settings. It sounds like the amount of response data is greater than the default cache size, so the refresh requests have been blocked/canceled.
Maybe you can take a look at the following link to learn more about this issue:
For the Power BI architecture, you can refer to the link below:
Regards,
Xiaoxin Sheng
Hi Xiaoxin,
Thanks for looking at this issue. As an update, we increased the variable size to 35 GB on both of the clusters we are running, but still encounter the same 4 GB error when attempting a refresh. We have a ticket in with the dev team to ascertain whether the error is being thrown by our Databricks instance (meaning we missed something in the adjustment of the variable) or whether it's occurring in the attempt to write to our Premium capacity storage.
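For anyone finding this later, the change itself was made in each cluster's Spark config (under the cluster's advanced options, one space-separated key/value pair per line) and only takes effect after a cluster restart, since it is a driver property read at startup. What we set, roughly:

    spark.driver.maxResultSize 35g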
Julius
Hi @jabate ,
I'd like to suggest you open a support ticket to get better support from the dev team; I think this issue is more related to Spark itself.
Regards,
Xiaoxin Sheng
Thanks Xiaoxin, a ticket is currently in but I have not heard back and need to follow up on it. The changes have been made in Spark, so I need to confirm with Microsoft support that the issue is not related to the Hive metastore which holds the uploaded PBIX files.
I'll make sure to post their resolution/recommendation once I'm able to get them back on a call and get it sorted out.
Did you manage to figure this out? I am getting the same error.