OS: Ubuntu 20.04
Hadoop version: Hadoop 3.3.1
Hive version: Hive 3.1.2
When starting Hive, the following output appears:
hadoop@fzqs-Laptop:/usr/local/hive/lib$ hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
The cause is simple: Hive ships with its own copies of some Hadoop-related dependencies, so once Hive is installed alongside Hadoop, both installations provide a jar defining the same SLF4J binding class, which produces the warning above.
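Before deleting anything, it can help to see exactly which jars clash. This is a minimal sketch (a helper of my own, not part of Hive or Hadoop) that pulls the jar paths out of the "Found binding" lines:

```shell
#!/bin/sh
# Extract the conflicting jar paths from SLF4J "Found binding" warning
# lines read on stdin, printing each distinct jar path once.
extract_bindings() {
  grep 'SLF4J: Found binding' |
    sed -e 's/.*\[jar:file:\([^!]*\)!.*/\1/' |
    sort -u
}
```

For example, `hive 2>&1 | extract_bindings` should print the two jar paths that appear in the warning, each exactly once.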
The fix is to delete the duplicated binding jar from either Hadoop or Hive. The relevant lines in the output above are:
SLF4J: Found binding in [jar:file:/usr/local/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
The jar paths in those lines are the locations of the duplicate files. Change into the Hive lib directory and list the SLF4J-related jars (note the leading `*` in the glob, since the jar name starts with `log4j-`):
cd /usr/local/hive/lib
ls *slf4j*
The output is as follows:
hadoop@fzqs-Laptop:/usr/local/hive/lib$ ls *slf4j*
log4j-slf4j-impl-2.10.0.jar
Deleting this log4j binding jar resolves the warning (the binding in Hadoop's lib directory then remains the only one):
sudo rm ./log4j-slf4j-impl-2.10.0.jar
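To confirm the fix, you can re-scan both lib directories for remaining SLF4J jars. A hedged sketch (the function name is my own; the paths are from this setup):

```shell
#!/bin/sh
# List any jar matching *slf4j* directly inside the given lib directories
# (non-recursive), so you can confirm only one binding jar remains.
list_slf4j_jars() {
  find "$@" -maxdepth 1 -name '*slf4j*'
}

# Example for this setup:
# list_slf4j_jars /usr/local/hive/lib /usr/local/hadoop/share/hadoop/common/lib
```

After the deletion above, the Hive lib directory should no longer contribute a `StaticLoggerBinder`, and restarting Hive should no longer print the multiple-bindings warning.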