Starting Spark from the command line

1. start-dfs.sh
2. /home/hadoop/apps/spark-1.6.1-bin-hadoop2.6/sbin/start-all.sh
3. /home/hadoop/apps/spark-1.6.1-bin-hadoop2.6/bin/spark-shell --master spark://hadoop01:7077 --executor-memory 1g
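The three steps above can be collected into a small script. This is a minimal sketch: it assumes the same Spark install path and master URL shown above, and simply assembles and prints the final spark-shell command before running it, so the long path is written only once.

```shell
#!/bin/sh
# Paths and master URL taken from the steps above; adjust for your cluster.
SPARK_HOME=/home/hadoop/apps/spark-1.6.1-bin-hadoop2.6
MASTER_URL=spark://hadoop01:7077

# Step 3: launch spark-shell against the standalone master with 1 GB per executor.
CMD="$SPARK_HOME/bin/spark-shell --master $MASTER_URL --executor-memory 1g"
echo "$CMD"

# Steps 1-2 (start HDFS, then the Spark standalone cluster) would precede this:
#   start-dfs.sh
#   "$SPARK_HOME"/sbin/start-all.sh
# Then: exec $CMD
```

Note that Spark's start-all.sh and Hadoop's start-all.sh share a name, which is why the Spark one is invoked by its full sbin path.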