spark-sql script fails: total size of serialized task results is bigger than spark.driver.maxResultSize

While running a spark-sql script that exports data, the job aborts with the following exception: Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Total size of serialized results of 1212 tasks (10300 MB) is bigger than spark.driver.maxResultSize
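This error occurs when the serialized results that executors send back to the driver (for example, when a query collects exported rows through the driver) exceed the `spark.driver.maxResultSize` limit, which defaults to 1g. One common fix is to raise the limit when launching the script; a sketch of that, where the 12g value and the `export.sql` filename are illustrative assumptions, not values from the original report:

```shell
# Raise the cap on total serialized task results returned to the driver
# (default is 1g). The error above reported ~10300 MB, so the new limit
# must exceed that; 12g here is an assumed example value.
spark-sql --conf spark.driver.maxResultSize=12g -f export.sql
```

The same property can be set persistently in `conf/spark-defaults.conf` as `spark.driver.maxResultSize 12g`. Setting it to `0` removes the limit entirely, but that risks an out-of-memory error on the driver; where possible, prefer writing output directly from the executors (e.g. `INSERT OVERWRITE DIRECTORY`) instead of funneling all rows through the driver.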