Spark-submit throws an error when submitting a job
Exception in thread "main" java.lang.IllegalArgumentException: System memory 202768384 must be at least 4.718592E8. Please use a larger heap size.
/usr/local/app/spark-1.6.1/bin/spark-submit \
--class cn.tbnb1.spark.sql.DataFrameCreate \
--master spark://v1:7077 \
--num-executors 2 \
--driver-memory 100m \
--executor-memory 100m \
--executor-cores 2 \
--files /usr/local/app/hive/conf/hive-site.xml \
--driver-class-path /usr/local/app/hive/lib/mysql-connector-java-5.1.17.jar \
/usr/local/testdata/spark-data/java/sql/jar/spark-demoes.jar
That is the submit script.
After a bit of analysis, I first increased the virtual machine's memory, but that did not solve the problem.
It looked like the driver was short of memory, so I tried raising the driver memory to 400m, but it still threw the following error:
Exception in thread "main" java.lang.IllegalArgumentException: System memory 402128896 must be at least 4.718592E8. Please use a larger heap size.
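Where does 4.718592E8 come from? It is 471,859,200 bytes, about 450 MB: Spark 1.6 reserves 300 MB of the heap and requires the usable system memory to be at least 1.5 times that reserve. Roughly, the check looks like the sketch below (my paraphrase of Spark 1.6's memory manager, not the exact source):

val reservedMemory = 300L * 1024 * 1024              // 300 MB reserved by Spark 1.5/1.6
val minSystemMemory = (reservedMemory * 1.5).toLong  // 471859200 bytes = 4.718592E8
val systemMemory = Runtime.getRuntime.maxMemory      // the driver (or executor) heap
if (systemMemory < minSystemMemory) {
  throw new IllegalArgumentException(s"System memory $systemMemory must " +
    s"be at least $minSystemMemory. Please use a larger heap size.")
}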
So you just need to bump it up some more; I gave the driver 1g (this check seems to have appeared only after Spark was upgraded to 1.5 or 1.6). After running it again, the job finished normally and produced the expected result.
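For reference, the working submit command only changes --driver-memory; everything else is the same as the script above. If the executors hit the same check, --executor-memory may need the same bump:

/usr/local/app/spark-1.6.1/bin/spark-submit \
--class cn.tbnb1.spark.sql.DataFrameCreate \
--master spark://v1:7077 \
--num-executors 2 \
--driver-memory 1g \
--executor-memory 100m \
--executor-cores 2 \
--files /usr/local/app/hive/conf/hive-site.xml \
--driver-class-path /usr/local/app/hive/lib/mysql-connector-java-5.1.17.jar \
/usr/local/testdata/spark-data/java/sql/jar/spark-demoes.jar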
You can also set it in code:
val conf = new SparkConf().setAppName("word count")
conf.set("spark.testing.memory", "536870912") // a raw byte count; anything above 471859200 (≈450 MB) works, 512 MB here
That solved the problem.
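Below is a minimal, self-contained sketch of the in-code route. The WordCount object, the local master, and the input path are illustrative assumptions of mine; only the spark.testing.memory setting comes from the snippet above. As far as I can tell, Spark reads spark.testing.memory as a plain byte count, so a number above 471859200 is the safest form of the value:

import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("word count")
      .setMaster("local[*]")                    // for a local test run; drop this when using spark-submit
      .set("spark.testing.memory", "536870912") // 512 MB in bytes, above the 471859200 minimum

    val sc = new SparkContext(conf)
    val counts = sc.textFile("hdfs:///tmp/words.txt") // hypothetical input path
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
    counts.take(10).foreach(println)
    sc.stop()
  }
}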