Hi,
I am getting the error below when trying to refresh data from Spark (Azure Databricks).
The model and dataset worked fine earlier; after we pushed more data into Spark and tried refreshing, I started getting this error.
Any suggestions would be highly appreciated. Thanks.
ODBC: ERROR [HY000] [Microsoft][Hardy] (35) Error from server: error code: '0' error message: 'org.apache.spark.SparkException: Job aborted due to stage failure: Total size of serialized results of 85 tasks (4.0 GB) is bigger than spark.driver.maxResultSize (4.0 GB)'. Table:
@SyedAli ,
Could you check whether the database driver limits the refresh data size? Please also refer to the similar case below:
https://community.powerbi.com/t5/Service/Azure-Data-Bricks-Data-Refresh/m-p/643085
Community Support Team _ Jimmy Tao
If this post helps, please consider accepting it as the solution so other members can find it more quickly.
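For reference, the error itself says the total serialized results of the job's tasks (4.0 GB across 85 tasks) exceeded the driver's `spark.driver.maxResultSize` cap, which is also 4.0 GB. One common remedy is to raise that cap. On Azure Databricks this is normally done in the cluster's Spark config (Compute > your cluster > Advanced options > Spark), e.g. `spark.driver.maxResultSize 8g`, since the property must be set before the driver starts. The sketch below shows the equivalent for a self-managed SparkSession; the app name and the 8g value are illustrative assumptions, not values from this thread:

```python
from pyspark.sql import SparkSession

# Sketch only: raise the driver result-size cap above the 4 GB limit reported
# in the error. On Databricks, set this in the cluster's Spark config instead,
# because an already-running session will not pick up this driver property.
spark = (
    SparkSession.builder
    .appName("powerbi-refresh")                   # assumed name, for illustration
    .config("spark.driver.maxResultSize", "8g")   # must be set before the session starts
    .getOrCreate()
)

# Verify the limit the session is actually running with
print(spark.conf.get("spark.driver.maxResultSize"))
```

Note that raising the limit only buys headroom: if the dataset keeps growing, the larger fix is to reduce how much data is pulled back to the driver (for example, by filtering or aggregating in the Power BI query so less data crosses the ODBC connection).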