SparkStreaming error: Only one SparkContext may be running in this JVM (see SPARK-2243)

Error message:

Exception in thread "main" org.apache.spark.SparkException: 
Only one SparkContext may be running in this JVM (see SPARK-2243). 
To ignore this error, set spark.driver.allowMultipleContexts = true. 
The currently running SparkContext was created at:
org.apache.spark.api.java.JavaSparkContext

Cause:

This error occurs because you created more than one SparkContext in the same JVM. In the usage below, both the JavaSparkContext constructor and the JavaStreamingContext constructor create a SparkContext; removing the JavaSparkContext (or reusing it, see Approach 2) fixes the problem:

SparkConf conf = new SparkConf()
        .setAppName("myapplication").setMaster("local[4]");
JavaSparkContext jsc = new JavaSparkContext(conf);
JavaStreamingContext stream = new JavaStreamingContext(conf, Durations.seconds(10));

There are two ways to solve this problem:

Approach 1: drop the JavaSparkContext entirely and let JavaStreamingContext create the SparkContext internally (it remains accessible via stream.sparkContext()):

SparkConf conf = new SparkConf()
        .setAppName("myapplication").setMaster("local[4]");
JavaStreamingContext stream = new JavaStreamingContext(conf, Durations.seconds(10));

Approach 2: keep the JavaSparkContext and pass it, rather than the SparkConf, to the JavaStreamingContext constructor, so both contexts share the single underlying SparkContext:

SparkConf conf = new SparkConf()
        .setAppName("myapplication").setMaster("local[4]");
JavaSparkContext jsc = new JavaSparkContext(conf);
JavaStreamingContext stream = new JavaStreamingContext(jsc, Durations.seconds(10));
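To see Approach 2 in context, here is a minimal sketch of a complete streaming program built around the shared context. The class name, the socket source on localhost:9999, and the word-splitting logic are illustrative assumptions, not part of the original post; it assumes Spark 2.x, where flatMap returns an Iterator:

```java
import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class SingleContextStreaming {
    public static void main(String[] args) throws InterruptedException {
        SparkConf conf = new SparkConf()
                .setAppName("myapplication")
                .setMaster("local[4]");

        // The one and only SparkContext in this JVM, wrapped once.
        JavaSparkContext jsc = new JavaSparkContext(conf);

        // Reuse that context for streaming instead of creating a second one,
        // which is what triggers SPARK-2243.
        JavaStreamingContext stream = new JavaStreamingContext(jsc, Durations.seconds(10));

        // Hypothetical source: a text socket (e.g. started with `nc -lk 9999`).
        JavaDStream<String> lines = stream.socketTextStream("localhost", 9999);
        JavaDStream<String> words =
                lines.flatMap(line -> Arrays.asList(line.split(" ")).iterator());
        words.print();

        stream.start();
        stream.awaitTermination();
    }
}
```

Because jsc and stream share one SparkContext, batch jobs (via jsc) and the streaming job can coexist in the same driver without hitting the "Only one SparkContext" check.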