

Installing Hadoop 3.3.0 and Hive 3.1.2 on Ubuntu 20.04.1 LTS

The installation steps below were consolidated from various online tutorials.

First, a look at how everything runs on this machine.

  • Check the OS environment
:~$ uname -srmo
Linux 5.4.0-48-generic x86_64 GNU/Linux
  • Check the Java environment
:~$ java -version
openjdk version "1.8.0_265"
OpenJDK Runtime Environment (build 1.8.0_265-8u265-b01-0ubuntu2~20.04-b01)
OpenJDK 64-Bit Server VM (build 25.265-b01, mixed mode)
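If the JDK is not present yet, OpenJDK 8 can be installed from Ubuntu's default repositories; a minimal sketch:

# Install OpenJDK 8 (standard Ubuntu 20.04 package name)
sudo apt update
sudo apt install -y openjdk-8-jdk
# Re-check the version afterwards
java -version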
  • Start Hadoop
:~$ start-all.sh 
WARNING: Attempting to start all Apache Hadoop daemons as *** in 10 seconds.
WARNING: This is not a recommended production deployment configuration.
WARNING: Use CTRL-C to abort.
Starting namenodes on [localhost]
Starting datanodes
Starting secondary namenodes [******]
Starting resourcemanager
Starting nodemanagers
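A quick way to confirm all five daemons actually came up is jps, which ships with the JDK:

# List running JVM processes; expect NameNode, DataNode,
# SecondaryNameNode, ResourceManager and NodeManager
jps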
  • Start MySQL
:~$ service mysql start
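To confirm MySQL is actually up, check the service state or connect once (a minimal check, assuming root credentials are already configured):

# Check the service state
service mysql status
# Or connect and print the server version
mysql -u root -p -e "SELECT VERSION();"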
  • Start Hive
:~$ hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/apache-hive-bin/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/apache-hive-bin/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Hive Session ID = cdacc36f-3cf7-40de-833b-503f04b0db09

Logging initialized using configuration in file:/opt/apache-hive-bin/conf/hive-log4j2.properties Async: true
Hive Session ID = b56e7d35-a955-456a-8d21-6712e92891ad
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
hive>
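The duplicate SLF4J binding warnings above are harmless. A common workaround is to move Hive's bundled binding aside so only Hadoop's remains (paths taken from the output above):

# Optional: silence the multiple-bindings warning by renaming Hive's SLF4J jar
mv /opt/apache-hive-bin/lib/log4j-slf4j-impl-2.10.0.jar \
   /opt/apache-hive-bin/lib/log4j-slf4j-impl-2.10.0.jar.bak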
  • Run a Hive query
hive> use hive;
OK
Time taken: 0.997 seconds
hive> select * from env;
OK
1	hadoop	hadoop-3.3.0.tar.gz	3.3.0	/opt/hadoop
2	hive	apache-hive-3.1.2-bin.tar.gz	3.1.2	/opt/apache-hive-bin
3	mysql	mysql-server	8.0.21	/usr/bin/mysql
Time taken: 1.983 seconds, Fetched: 3 row(s)
hive> 
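The original DDL for env is not shown; for reference, a table with this layout could have been created and populated roughly as follows (column names here are illustrative assumptions):

# Hypothetical sketch of how the env table might be created, run from the shell
hive -e "
CREATE DATABASE IF NOT EXISTS hive;
USE hive;
CREATE TABLE IF NOT EXISTS env (
  id INT, name STRING, package STRING, version STRING, install_path STRING
) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t';
INSERT INTO env VALUES (1, 'hadoop', 'hadoop-3.3.0.tar.gz', '3.3.0', '/opt/hadoop');
"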
  • View the Hive table files on HDFS
:~$ hdfs dfs -ls -R /user/hive/warehouse
drwxr-xr-x   - *** supergroup          0 2020-10-08 11:37 /user/hive/warehouse/hive.db
drwxr-xr-x   - *** supergroup          0 2020-10-08 11:51 /user/hive/warehouse/hive.db/env
-rw-r--r--   1 *** supergroup        897 2020-10-08 11:51 /user/hive/warehouse/hive.db/env/000000_0
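Since Hive stores table data as plain files, the row data can be read straight from HDFS:

# Print the raw, tab-delimited contents of the env table's data file
hdfs dfs -cat /user/hive/warehouse/hive.db/env/000000_0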
  • Check the installation paths of Hadoop, Hive, and MySQL
:~$ which hadoop
/opt/hadoop/bin/hadoop
:~$ which hive
/opt/apache-hive-bin/bin/hive
:~$ which mysql
/usr/bin/mysql

Preparation

  • Device & machine

Machine (virtual machine): Ubuntu 20.04.1 LTS, with OpenJDK 1.8 already installed

  • Installation packages & executables

Hadoop archive: hadoop-3.3.0.tar.gz (download link)
Hive archive: apache-hive-3.1.2-bin.tar.gz (download link)
mysql-connector-java: mysql-connector-java-8.0.21.jar (download link)
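If you prefer the command line, the archives can be fetched with wget (URLs assume the standard Apache archive and Maven Central layout):

wget https://archive.apache.org/dist/hadoop/common/hadoop-3.3.0/hadoop-3.3.0.tar.gz
wget https://archive.apache.org/dist/hive/hive-3.1.2/apache-hive-3.1.2-bin.tar.gz
wget https://repo1.maven.org/maven2/mysql/mysql-connector-java/8.0.21/mysql-connector-java-8.0.21.jar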

Software preparation

  • Install openssh-server
:~$ sudo apt install openssh-server
  • Check whether SSH was installed successfully
:~$ ssh localhost

If the installation succeeded, you will be prompted for a password; after entering it, the login banner confirms the connection:

***@localhost's password: 
Welcome to Ubuntu 20.04.1 LTS (GNU/Linux 5.4.0-48-generic x86_64)

 * Documentation:  https://help.ubuntu.com
 * Management:     https://landscape.canonical.com
 * Support:        https://ubuntu.com/advantage

0 updates can be installed immediately.
0 of these updates are security updates.

Your Hardware Enablement Stack (HWE) is supported until April 2025.
Last login: Tue Oct  6 14:37:47 2020 from 127.0.0.1
  • Log out of the SSH session
:~$ logout
  • Configure passwordless SSH login (this creates the authorized_keys file)
:~$ cd ./.ssh
:~/.ssh$ ls
id_rsa  id_rsa.pub  known_hosts
:~/.ssh$ cat ./id_rsa.pub >> ./authorized_keys
:~/.ssh$ ls
authorized_keys  id_rsa  id_rsa.pub  known_hosts

After openssh-server is installed on Ubuntu, the .ssh folder appears under the user's home directory once SSH is first used; if the id_rsa key pair shown above is missing, generate it as sketched below.
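A minimal sketch, assuming the default key location:

# Generate an RSA key pair with an empty passphrase
ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa
# sshd may ignore authorized_keys that are group/world writable
chmod 600 ~/.ssh/authorized_keys
# ssh localhost should now log in without a password
ssh localhost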

Install Hadoop

Extract the Hadoop archive

:~$ sudo tar -zxvf ./hadoop-3.3.0.tar.gz -C /opt
:~$ cd /opt
:/opt$ sudo mv ./hadoop-3.3.0 ./hadoop
:/opt$ sudo chgrp -R root ./hadoop
:/opt$ sudo chown -R root ./hadoop
:/opt$ sudo chmod -R 755 ./hadoop
:/opt$ ls -al | grep 'hadoop'
drwxr-xr-x  9 root root  4096 Sep 11  2019 hadoop
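Before touching any environment variables, the unpacked distribution can be sanity-checked by calling the binary with its full path:

# Should print the Hadoop version plus build information
/opt/hadoop/bin/hadoop version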

Configure Hadoop environment variables

  • Add the Hadoop environment variables
:/opt$ cd
:~$ vim ./.bashrc

In vim, add entries for HADOOP_HOME, HADOOP_INSTALL, HADOOP_MAPRED_HOME, HADOOP_COMMON_HOME, HADOOP_HDFS_HOME, YARN_HOME, PATH, and HADOOP_CONF_DIR, for example:

export HADOOP_HOME=/opt/hadoop
export HADOOP_INSTALL=${HADOOP_HOME}
export HADOOP_MAPRED_HOME=${HADOOP_HOME}
export HADOOP_COMMON_HOME=${HADOOP_HOME}
export HADOOP_HDFS_HOME=${HADOOP_HOME}
export YARN_HOME=${HADOOP_HOME}
export HADOOP_CONF_DIR=${HADOOP_HOME}/etc/hadoop
export PATH=${PATH}:${HADOOP_HOME}/bin:${HADOOP_HOME}/sbin
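After saving .bashrc, reload it so the new variables take effect in the current shell:

# Apply the changes without opening a new terminal
source ~/.bashrc
# hadoop should now resolve via PATH
hadoop version
which hadoop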