Select "File" -> "Project Structure" -> "Libraries", click "+" -> "Java", choose the downloaded spark-assembly-1.5.1-hadoop2.6.0.jar, then click "OK".
Next, select "File" -> "Project Structure" -> "Artifacts", then "+" -> "Jar" -> "From Modules with dependencies". Choose the class containing the main function, pick the output location for the jar in the dialog, and click "OK".
Finally, select "Build" -> "Build Artifacts" to compile and generate the jar, as shown in the figure below.
hadoop@master:~/wujiadong$ spark-submit --class wujiadong.spark.WordCount --executor-memory 500m --total-executor-cores 2 /home/hadoop/wujiadong/wujiadong.spark.jar hdfs://master:9000/wordcount.txt
17/02/02 20:27:34 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/02/02 20:27:37 INFO Slf4jLogger: Slf4jLogger started
17/02/02 20:27:37 INFO Remoting: Starting remoting
17/02/02 20:27:37 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.1.131:52310]
17/02/02 20:27:41 WARN MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
17/02/02 20:27:44 INFO FileInputFormat: Total input paths to process : 1
17/02/02 20:27:51 INFO deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id
17/02/02 20:27:51 INFO deprecation: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
17/02/02 20:27:51 INFO deprecation: mapred.task.is.map is deprecated. Instead, use mapreduce.task.ismap
17/02/02 20:27:51 INFO deprecation: mapred.task.partition is deprecated. Instead, use mapreduce.task.partition
17/02/02 20:27:51 INFO deprecation: mapred.job.id is deprecated. Instead, use mapreduce.job.id
(spark,1)
(wujiadong,1)
(hadoop,1)
(python,1)
(hello,4)
17/02/02 20:27:52 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
17/02/02 20:27:52 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
17/02/02 20:27:52 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
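The `wujiadong.spark.WordCount` class submitted above is not shown in this post; a minimal sketch of what such a word-count job typically looks like in the Spark 1.x RDD API is given below (the package and class names match the spark-submit command, but the body is an assumed reconstruction, not the author's exact source):

```scala
package wujiadong.spark

import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical reconstruction of the WordCount job submitted above.
object WordCount {
  def main(args: Array[String]): Unit = {
    // Master URL and app name can also be set via spark-submit flags.
    val conf = new SparkConf().setAppName("WordCount")
    val sc = new SparkContext(conf)

    sc.textFile(args(0))          // e.g. hdfs://master:9000/wordcount.txt
      .flatMap(_.split(" "))      // split each line into words
      .map(word => (word, 1))     // pair each word with a count of 1
      .reduceByKey(_ + _)         // sum counts per word
      .collect()
      .foreach(println)           // prints pairs such as (hello,4)

    sc.stop()
  }
}
```

The `(word, 1)` pairs printed in the log output above, e.g. `(hello,4)`, are exactly what the final `foreach(println)` produces on the driver.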