
A note on class-not-found errors from Hive-on-Spark insert/update statements, caused by not using the pure (clean) Spark package


【Background】

Hive started fine and table creation worked: I created a student table with no problem. But running the following insert statement produced the error below:

  hive (default)> insert into table student values(1,'abc');
  Query ID = atguigu_20240417184003_f9d459d7-1993-487f-8d5f-eafdb48d94c1
  Total jobs = 1
  Launching Job 1 out of 1
  In order to change the average load for a reducer (in bytes):
    set hive.exec.reducers.bytes.per.reducer=<number>
  In order to limit the maximum number of reducers:
    set hive.exec.reducers.max=<number>
  In order to set a constant number of reducers:
    set mapreduce.job.reduces=<number>
  Failed to monitor Job[-1] with exception 'java.lang.IllegalStateException(Connection to remote Spark driver was lost)' Last known state = SENT
  Failed to execute spark task, with exception 'java.lang.IllegalStateException(RPC channel is closed.)'
  FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask. RPC channel is closed.
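
Note that the failure comes from Hive's SparkTask layer, not from the SQL itself. As a sanity check that Hive really is routing the query through Spark, you can print the engine setting in the Hive CLI; a minimal check, with the output from my setup:

  hive (default)> set hive.execution.engine;
  hive.execution.engine=spark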

Since the job ran on YARN, I went to YARN for the error log.
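
Besides the web UI, the same log can be pulled from the command line; a minimal sketch, assuming you look up the application ID of the failed query first (the ID placeholder below is not from my run):

  yarn application -list -appStates FAILED
  yarn logs -applicationId <applicationId>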

The YARN error log is as follows:

[screenshot of the YARN container log: a class-not-found stack trace]

It says this class cannot be found?!

【Cause and fix】: The Spark package I had deployed was simply the wrong one; Hive on Spark should use the pure (clean) Spark package. Re-extract the pure package into the module directory; every other step is the same as for the original Spark setup.
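
For reference, a minimal sketch of the swap, assuming Spark 3.0.0, an /opt/module layout, and an HDFS jar directory named /spark-jars (the tarball name and all paths here are assumptions from this style of setup; adjust them to your environment):

  # extract the pure (without-hadoop) Spark build into the module directory
  tar -zxvf spark-3.0.0-bin-without-hadoop.tgz -C /opt/module/

  # Hive on Spark on YARN typically loads Spark jars from HDFS; replace them
  # with the pure build's jars so no conflicting classes are shipped to YARN
  hadoop fs -mkdir -p /spark-jars
  hadoop fs -put /opt/module/spark-3.0.0-bin-without-hadoop/jars/* /spark-jars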
