
Error: Only one SparkContext should be running in this JVM (see SPARK-2243)


Error message:

  scala> import org.apache.spark._
  import org.apache.spark._

  scala> import org.apache.spark.streaming._
  import org.apache.spark.streaming._

  scala> val conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount")
  20/02/09 15:11:31 WARN ProcfsMetricsGetter: Exception when trying to compute pagesize, as a result reporting of ProcessTree metrics is stopped
  conf: org.apache.spark.SparkConf = org.apache.spark.SparkConf@78168463

  scala> val ssc = new StreamingContext(conf, Seconds(1))
  org.apache.spark.SparkException: Only one SparkContext should be running in this JVM (see SPARK-2243). The currently running SparkContext was created at:
  org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:887)
  org.apache.spark.repl.Main$.createSparkSession(Main.scala:106)
  <init>(<console>:15)
  <init>(<console>:42)
  <init>(<console>:44)
  .<init>(<console>:48)
  .<clinit>(<console>)
  .$print$lzycompute(<console>:7)
  .$print(<console>:6)
  $print(<console>)
  sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  java.lang.reflect.Method.invoke(Method.java:498)
  scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:745)
  scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1021)
  scala.tools.nsc.interpreter.IMain.$anonfun$interpret$1(IMain.scala:574)
  scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:41)
  scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:37)
  scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:41)
    at org.apache.spark.SparkContext$.$anonfun$assertNoOtherContextIsRunning$2(SparkContext.scala:2548)
    at scala.Option.foreach(Option.scala:407)
    at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:2545)
    at org.apache.spark.SparkContext$.markPartiallyConstructed(SparkContext.scala:2622)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:89)
    at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:850)
    at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:84)
    ... 51 elided

Cause:

Only one SparkContext may run in a given JVM, and spark-shell has already created one at startup (exposed as the variable `sc`). Passing a SparkConf to the StreamingContext constructor makes it try to create a second SparkContext, which triggers the exception.

Solution:


Change

  val ssc = new StreamingContext(conf, Seconds(1))

to

  val ssc = new StreamingContext(sc, Seconds(1))

so the StreamingContext reuses the SparkContext that spark-shell already created, instead of constructing a new one.
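Note that the original `new StreamingContext(conf, Seconds(1))` form is only a problem inside spark-shell. In a standalone application no SparkContext exists yet, so letting the StreamingContext create one from a SparkConf is the normal pattern. A minimal sketch (the app name, host, and port are illustrative, not from the original article):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Hypothetical standalone app: outside spark-shell there is no pre-existing
// SparkContext, so StreamingContext may safely create one from the SparkConf.
object NetworkWordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount")
    val ssc = new StreamingContext(conf, Seconds(1)) // creates the JVM's single SparkContext

    // Count words arriving on a socket, printing each batch's counts.
    val lines = ssc.socketTextStream("localhost", 9999)
    lines.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _).print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Submitting this with `spark-submit` runs it in its own JVM, where the single-context rule is satisfied.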

For example:

  scala> import org.apache.spark._
  import org.apache.spark._

  scala> import org.apache.spark.streaming._
  import org.apache.spark.streaming._

  scala> import org.apache.spark.streaming.StreamingContext._
  import org.apache.spark.streaming.StreamingContext._

  scala> val ssc = new StreamingContext(sc, Seconds(1))
  ssc: org.apache.spark.streaming.StreamingContext = org.apache.spark.streaming.StreamingContext@1ef50c3b

  scala> val ssc = new StreamingContext(sc, Seconds(1))
  ssc: org.apache.spark.streaming.StreamingContext = org.apache.spark.streaming.StreamingContext@680a5de1

 
