For work I need to study Hive features, and the production environment cannot be touched, so this walks through building a single-node Hadoop + Hive test environment using the current latest releases: Hadoop 3.3.6 + Hive 3.1.3.
Environment:
OS: CentOS 7
JDK install directory: /usr/java/
Hadoop install directory: /opt/module/
Hive install directory: /opt/module/
Prepare the Java 8 package: jdk-8u201-linux-x64.tar.gz
Unpack and install: mkdir -p /usr/java && tar -zxvf jdk-8u201-linux-x64.tar.gz -C /usr/java/
Configure the environment variables by editing: vi /etc/profile.d/my_env.sh
Add the JDK environment variables:
# JAVA_HOME; export makes the variable available globally
export JAVA_HOME=/usr/java/jdk1.8.0_201
export PATH=$PATH:$JAVA_HOME/bin
Reload the configuration: source /etc/profile
Verify: java -version
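Optionally, also confirm that the variable itself resolved correctly (a quick sanity check, nothing beyond what the steps above already set up):
# should print /usr/java/jdk1.8.0_201 if my_env.sh was sourced
echo $JAVA_HOME
# should resolve to a binary under $JAVA_HOME/bin unless another JDK is already earlier on PATH
which java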
mkdir -p /opt/module/ && tar -zxvf hadoop-3.3.6.tar.gz -C /opt/module/
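A quick check that the archive unpacked correctly and that Hadoop can find the JDK configured above:
# should print "Hadoop 3.3.6" plus build details
/opt/module/hadoop-3.3.6/bin/hadoop version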
vi /opt/module/hadoop-3.3.6/etc/hadoop/core-site.xml
Between the <configuration> tags, add host and user-group proxy permissions for the root user. The complete content is:
<configuration>
  <property>
    <name>hadoop.proxyuser.root.hosts</name>
    <value>*</value>
  </property>
  <property>
    <name>hadoop.proxyuser.root.groups</name>
    <value>*</value>
  </property>
</configuration>
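To confirm Hadoop actually picks these settings up, the values can be read back with hdfs getconf (this only reads the local configuration files, so HDFS does not need to be running):
# both commands should print *
/opt/module/hadoop-3.3.6/bin/hdfs getconf -confKey hadoop.proxyuser.root.hosts
/opt/module/hadoop-3.3.6/bin/hdfs getconf -confKey hadoop.proxyuser.root.groups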
# Unpack the bundle
tar -xvf mysql-*-1.el7.x86_64.rpm-bundle.tar
# Install the RPMs
rpm -ivh mysql-community-common-*-1.el7.x86_64.rpm --nodeps --force && rpm -ivh mysql-community-libs-*-1.el7.x86_64.rpm --nodeps --force && rpm -ivh mysql-community-client-*-1.el7.x86_64.rpm --nodeps --force && rpm -ivh mysql-community-server-*-1.el7.x86_64.rpm --nodeps --force
# Verify the installed MySQL packages
rpm -qa | grep mysql
# Initialize MySQL
mysqld --initialize
# Fix directory ownership
chown mysql:mysql /var/lib/mysql -R
# Start the service and enable it at boot
systemctl start mysqld.service && systemctl enable mysqld
# Look up the generated initial root password
cat /var/log/mysqld.log | grep password
# Reset the root password and allow remote root login
mysql -uroot -p --connect-expired-password -e "alter user 'root'@'localhost' identified by '12WE#o89T'; create user 'root'@'%' identified by '12WE#o89T'; USE mysql; grant all on *.* TO 'root'@'%'; FLUSH PRIVILEGES;"
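Before wiring Hive to this database, it is worth confirming that the new root password works; a minimal check using the password set above:
# should list the default system databases (information_schema, mysql, ...)
mysql -uroot -p'12WE#o89T' -e "SHOW DATABASES;"
# confirm the remote-capable root account exists
mysql -uroot -p'12WE#o89T' -e "SELECT user, host FROM mysql.user;"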
mkdir -p /opt/module/ && tar -zxvf apache-hive-3.1.3-bin.tar.gz -C /opt/module/
vi /etc/profile.d/my_env.sh
Add the following content:
export HIVE_HOME=/opt/module/apache-hive-3.1.3-bin
export PATH=$PATH:$HIVE_HOME/bin
source /etc/profile
cp /opt/module/apache-hive-3.1.3-bin/conf/hive-default.xml.template /opt/module/apache-hive-3.1.3-bin/conf/hive-site.xml
vi /opt/module/apache-hive-3.1.3-bin/conf/hive-site.xml
Set the following properties (they already exist in the copied template, so update their values rather than adding duplicates; note the ampersands in the JDBC URL must be escaped as &amp; inside XML):
<!-- MySQL connection URL -->
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true&amp;characterEncoding=UTF-8&amp;useSSL=false&amp;allowPublicKeyRetrieval=true</value>
</property>
<!-- MySQL user name -->
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>root</value>
</property>
<!-- MySQL password -->
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>12WE#o89T</value>
</property>
<!-- MySQL JDBC driver class; this is for MySQL 8, for MySQL 5 use com.mysql.jdbc.Driver -->
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.cj.jdbc.Driver</value>
</property>
<property>
  <name>datanucleus.schema.autoCreateAll</name>
  <value>true</value>
</property>
<property>
  <name>hive.metastore.schema.verification</name>
  <value>false</value>
</property>
<!-- Hive login user name -->
<property>
  <name>hive.server2.thrift.client.user</name>
  <value>root</value>
  <description>Username to use against thrift client</description>
</property>
<!-- Hive login password -->
<property>
  <name>hive.server2.thrift.client.password</name>
  <value>1234</value>
  <description>Password to use against thrift client</description>
</property>
<!-- local scratch/cache directories -->
<property>
  <name>hive.exec.local.scratchdir</name>
  <value>/tmp/hive</value>
  <description>Local scratch space for Hive jobs</description>
</property>
<property>
  <name>hive.downloaded.resources.dir</name>
  <value>/tmp/hive/resources</value>
  <description>Temporary local directory for added resources in the remote file system.</description>
</property>
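Since the copied hive-site.xml is over 3000 lines, it is easy to end up with malformed XML (see the schematool error further below). A possible pre-check, assuming xmllint from the libxml2 package is available:
# prints nothing when the XML is well formed; otherwise reports the offending line and column
xmllint --noout /opt/module/apache-hive-3.1.3-bin/conf/hive-site.xml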
mkdir -p /tmp/hive && chmod 777 /tmp/hive
cp /opt/module/apache-hive-3.1.3-bin/conf/hive-env.sh.template /opt/module/apache-hive-3.1.3-bin/conf/hive-env.sh
vi /opt/module/apache-hive-3.1.3-bin/conf/hive-env.sh and add:
HADOOP_HOME=/opt/module/hadoop-3.3.6
export HIVE_CONF_DIR=/opt/module/apache-hive-3.1.3-bin/conf
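At this point the Hive binaries should already be usable even though the metastore is not initialized yet; a quick sanity check that involves no metastore access:
# should report Hive 3.1.3
hive --version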
Download the MySQL JDBC driver:
wget https://repo1.maven.org/maven2/mysql/mysql-connector-java/8.0.30/mysql-connector-java-8.0.30.jar
Copy it into Hive's lib directory and confirm it is there:
cp mysql-connector-java-*.jar /opt/module/apache-hive-3.1.3-bin/lib && ls /opt/module/apache-hive-3.1.3-bin/lib/mysql-connector-java-*.jar
Initialize the metastore schema:
/opt/module/apache-hive-3.1.3-bin/bin/schematool -initSchema -dbType mysql
On success the end of the output shows:
Initialization script completed
schemaTool completed
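If initialization succeeded, the metastore tables now exist in MySQL. A quick spot check (the database name hive comes from the ConnectionURL configured above):
# should list metastore tables such as DBS, TBLS, VERSION, ...
mysql -uroot -p'12WE#o89T' -e "USE hive; SHOW TABLES;" | head -n 20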
The run may instead fail with an XML parsing exception like the following:
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Exception in thread "main" java.lang.RuntimeException: com.ctc.wstx.exc.WstxParsingException: Illegal character entity: expansion character (code 0x8 at [row,col,system-id]: [3215,96,"file:/opt/module/apache-hive-3.1.3-bin/conf/hive-site.xml"]
	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3101)
	at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:3050)
	at org.apache.hadoop.conf.Configuration.loadProps(Configuration.java:2923)
	at org.apache.hadoop.conf.Configuration.addResourceObject(Configuration.java:1035)
	at org.apache.hadoop.conf.Configuration.addResource(Configuration.java:940)
	at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5154)
	at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5107)
	at org.apache.hive.beeline.HiveSchemaTool.<init>(HiveSchemaTool.java:96)
	at org.apache.hive.beeline.HiveSchemaTool.main(HiveSchemaTool.java:1473)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.util.RunJar.run(RunJar.java:328)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:241)
Fix: edit file:/opt/module/apache-hive-3.1.3-bin/conf/hive-site.xml and delete or comment out the offending text at line 3215 (the position reported in the trace). The root cause is an XML parsing failure: the template contains an illegal character entity that the parser rejects. Then re-run schematool.
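To see exactly what needs to be removed before editing, print the line reported in the stack trace (3215 in this case):
# shows the description text containing the illegal character entity
sed -n '3215p' /opt/module/apache-hive-3.1.3-bin/conf/hive-site.xml
After deleting or commenting it out, the xmllint check shown earlier should pass and schematool can be run again.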
Create a log directory: mkdir -p /opt/module/logs/
Start the hiveserver2 service in the background:
nohup hive --service hiveserver2 > /opt/module/logs/hive3.1.3.log 2>&1 &
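HiveServer2 can take a while to finish starting. Before connecting, a simple check (assuming the default Thrift port 10000 has not been changed) is to see whether the port is listening and to glance at the log written above:
# repeat until the port appears in the output
ss -lntp | grep ':10000'
# startup progress and errors go to the log file configured above
tail -n 50 /opt/module/logs/hive3.1.3.log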
Verify the connection with beeline:
beeline
!connect jdbc:hive2://localhost:10000 root 1234
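Once connected (or instead of the interactive session), a short non-interactive smoke test exercises the metastore end to end; test_db and t1 are just example names:
beeline -u jdbc:hive2://localhost:10000 -n root -p 1234 \
  -e "create database if not exists test_db;" \
  -e "create table if not exists test_db.t1 (id int, name string);" \
  -e "show tables in test_db;"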