
2020-12-04_WARN ShutdownHookManager: ShutdownHook '$anon$2' failed

WARN ShutdownHookManager: ShutdownHook '$anon$2' failed,

java.io.IOException: Failed to delete: C:\Users\DELL\AppData\Local\Temp\spark-c7e93bb8-2f5a-4265-a608-185624b0f906
    at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:1031)
    at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:65)
    at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:62)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
    at org.apache.spark.util.ShutdownHookManager$$anonfun$1.apply$mcV$sp(ShutdownHookManager.scala:62)
    at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:216)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1954)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
    at scala.util.Try$.apply(Try.scala:192)
    at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
    at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
 

 

Solution:

Remove this cluster-mode parameter setting when running locally:   .setJars(List("D:\\code\\Test\\TestSpark\\out\\artifacts\\TestSpark_jar\\TestSpark.jar"))
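The resulting local setup might look like the sketch below. The object name, app name, and job body are assumptions for illustration; only the removal of .setJars and the jar path come from this article. In local mode the driver runs the executors in-process, so there is no remote executor that needs the jar shipped to it.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object TestSpark {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("TestSpark")
      .setMaster("local[*]") // local mode: executors run inside the driver JVM
    // Cluster-only setting, removed for local runs:
    // .setJars(List("D:\\code\\Test\\TestSpark\\out\\artifacts\\TestSpark_jar\\TestSpark.jar"))

    val sc = new SparkContext(conf)
    try {
      // placeholder job body; replace with your own logic
      println(sc.parallelize(1 to 10).sum())
    } finally {
      sc.stop() // stop the context explicitly instead of relying on the shutdown hook
    }
  }
}
```

When submitting to a real cluster, pass the jar via spark-submit (or restore .setJars) rather than hard-coding a local Windows path.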