```
Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at: org.apache.spark.api.java.JavaSparkContext
```
This error occurs because more than one SparkContext has been created in the same JVM. In the usage below, simply removing the `JavaSparkContext` fixes it:
Problematic code (two SparkContexts end up being created):

```java
SparkConf conf = new SparkConf()
        .setAppName("myapplication")
        .setMaster("local[4]");
// JavaSparkContext creates the first SparkContext ...
JavaSparkContext jsc = new JavaSparkContext(conf);
// ... and this constructor internally creates a second one -> SparkException
JavaStreamingContext stream = new JavaStreamingContext(conf, Durations.seconds(10));
```
Fix: drop the `JavaSparkContext` entirely and let `JavaStreamingContext` create the SparkContext itself:

```java
SparkConf conf = new SparkConf()
        .setAppName("myapplication")
        .setMaster("local[4]");
JavaStreamingContext stream = new JavaStreamingContext(conf, Durations.seconds(10));
```
Alternatively, if you still need the `JavaSparkContext`, pass it to the `JavaStreamingContext` constructor so both share a single underlying SparkContext:

```java
SparkConf conf = new SparkConf()
        .setAppName("myapplication")
        .setMaster("local[4]");
JavaSparkContext jsc = new JavaSparkContext(conf);
// reuse jsc instead of letting the streaming context create a second SparkContext
JavaStreamingContext stream = new JavaStreamingContext(jsc, Durations.seconds(10));
```
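The mechanism behind the error is a JVM-wide guard: when a SparkContext starts, it registers itself, and any second context fails fast (see SPARK-2243). The sketch below is a hypothetical, Spark-free illustration of that one-instance-per-JVM pattern; all class and method names here (`OnlyOneContext`, `secondContextRejected`) are invented for illustration and are not Spark APIs.

```java
import java.util.concurrent.atomic.AtomicReference;

// Hypothetical sketch of a "one context per JVM" guard, loosely modeled on
// the check Spark performs when a second SparkContext is constructed.
class OnlyOneContext {
    private static final AtomicReference<OnlyOneContext> ACTIVE = new AtomicReference<>();

    OnlyOneContext() {
        // Atomically claim the JVM-wide slot; fail if another context holds it.
        if (!ACTIVE.compareAndSet(null, this)) {
            throw new IllegalStateException(
                    "Only one OnlyOneContext may be running in this JVM");
        }
    }

    void stop() {
        // Release the slot so a new context may be created afterwards.
        ACTIVE.compareAndSet(this, null);
    }
}

public class Demo {
    // Returns true if constructing a second context while the first is
    // still running is rejected, mirroring the SparkException above.
    static boolean secondContextRejected() {
        OnlyOneContext first = new OnlyOneContext();
        try {
            new OnlyOneContext(); // second context in the same JVM
            return false;
        } catch (IllegalStateException e) {
            return true;
        } finally {
            first.stop(); // after stop(), a new context is allowed again
        }
    }

    public static void main(String[] args) {
        System.out.println("second context rejected: " + secondContextRejected());
    }
}
```

This is also why both fixes above work: each arranges for exactly one SparkContext to be registered, either by never creating the second one or by reusing the first.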