The installation followed various tutorials found online; the steps are consolidated below.
:~$ uname -srmo
Linux 5.4.0-48-generic x86_64 GNU/Linux
:~$ java -version
openjdk version "1.8.0_265"
OpenJDK Runtime Environment (build 1.8.0_265-8u265-b01-0ubuntu2~20.04-b01)
OpenJDK 64-Bit Server VM (build 25.265-b01, mixed mode)
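If the JDK is missing, OpenJDK 8 can be installed from the Ubuntu 20.04 repositories (a minimal sketch; the exact build string shown above may differ):
:~$ sudo apt update
:~$ sudo apt install openjdk-8-jdk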
:~$ start-all.sh
WARNING: Attempting to start all Apache Hadoop daemons as *** in 10 seconds.
WARNING: This is not a recommended production deployment configuration.
WARNING: Use CTRL-C to abort.
Starting namenodes on [localhost]
Starting datanodes
Starting secondary namenodes [******]
Starting resourcemanager
Starting nodemanagers
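A quick way to confirm that all five daemons actually started is jps, which ships with the JDK; its output should list NameNode, DataNode, SecondaryNameNode, ResourceManager, and NodeManager (PIDs will vary):
:~$ jps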
:~$ service mysql start
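Before the first Hive launch, the MySQL metastore needs a database and user, and the JDBC driver must be on Hive's classpath. A hedged sketch (the database name, user name, and password below are assumptions; substitute your own):
:~$ sudo cp ./mysql-connector-java-8.0.21.jar /opt/apache-hive-bin/lib/
:~$ mysql -u root -p
mysql> CREATE DATABASE hive_metastore;
mysql> CREATE USER 'hive'@'localhost' IDENTIFIED BY 'hive_password';
mysql> GRANT ALL PRIVILEGES ON hive_metastore.* TO 'hive'@'localhost';
mysql> FLUSH PRIVILEGES;
mysql> exit;
:~$ schematool -dbType mysql -initSchema
Note that schematool reads its connection settings from hive-site.xml, so that file must point at the database and user created above.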
:~$ hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/apache-hive-bin/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/apache-hive-bin/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Hive Session ID = cdacc36f-3cf7-40de-833b-503f04b0db09
Logging initialized using configuration in file:/opt/apache-hive-bin/conf/hive-log4j2.properties Async: true
Hive Session ID = b56e7d35-a955-456a-8d21-6712e92891ad
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
hive>
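The duplicate SLF4J binding warning above appears because Hive and Hadoop each bundle their own binding jar. A common workaround (a hedged sketch; renaming rather than deleting keeps the jar recoverable) is to disable the copy in Hive's lib directory:
:~$ sudo mv /opt/apache-hive-bin/lib/log4j-slf4j-impl-2.10.0.jar /opt/apache-hive-bin/lib/log4j-slf4j-impl-2.10.0.jar.bak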
hive> use hive;
OK
Time taken: 0.997 seconds
hive> select * from env;
OK
1 hadoop hadoop-3.3.0.tar.gz 3.3.0 /opt/hadoop
2 hive apache-hive-3.1.2-bin.tar.gz 3.1.2 /opt/apache-hive-bin
3 mysql mysql-server 8.0.21 /usr/bin/mysql
Time taken: 1.983 seconds, Fetched: 3 row(s)
hive>
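For context, the env table queried above could have been created and populated with statements along these lines (a hedged reconstruction; the column names are assumptions inferred from the query output, and the row values are copied from it):
hive> create database if not exists hive;
hive> use hive;
hive> create table env(id int, component string, file string, version string, path string);
hive> insert into env values
    >   (1, 'hadoop', 'hadoop-3.3.0.tar.gz', '3.3.0', '/opt/hadoop'),
    >   (2, 'hive', 'apache-hive-3.1.2-bin.tar.gz', '3.1.2', '/opt/apache-hive-bin'),
    >   (3, 'mysql', 'mysql-server', '8.0.21', '/usr/bin/mysql');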
:~$ hdfs dfs -ls -R /user/hive/warehouse
drwxr-xr-x - *** supergroup 0 2020-10-08 11:37 /user/hive/warehouse/hive.db
drwxr-xr-x - *** supergroup 0 2020-10-08 11:51 /user/hive/warehouse/hive.db/env
-rw-r--r-- 1 *** supergroup 897 2020-10-08 11:51 /user/hive/warehouse/hive.db/env/000000_0
:~$ which hadoop
/opt/hadoop/bin/hadoop
:~$ which hive
/opt/apache-hive-bin/bin/hive
:~$ which mysql
/usr/bin/mysql
Machine (virtual machine): Ubuntu 20.04.1 LTS, with OpenJDK 1.8 already installed
Hadoop installation file: hadoop-3.2.1.tar.gz (download link)
Hive installation file: apache-hive-3.1.2-bin.tar.gz (download link)
mysql-connector-java file: mysql-connector-java-8.0.21.jar (download link)
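The original download links are not preserved above; the Apache archive and Maven Central carry the same files (the mirror URLs below are my assumption, so verify checksums after downloading):
:~$ wget https://archive.apache.org/dist/hadoop/common/hadoop-3.2.1/hadoop-3.2.1.tar.gz
:~$ wget https://archive.apache.org/dist/hive/hive-3.1.2/apache-hive-3.1.2-bin.tar.gz
:~$ wget https://repo1.maven.org/maven2/mysql/mysql-connector-java/8.0.21/mysql-connector-java-8.0.21.jar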
:~$ sudo apt install openssh-server
:~$ ssh localhost
If the installation succeeded, ssh prompts for a password;
entering it completes the connection:
***@localhost's password:
Welcome to Ubuntu 20.04.1 LTS (GNU/Linux 5.4.0-48-generic x86_64)
* Documentation: https://help.ubuntu.com
* Management: https://landscape.canonical.com
* Support: https://ubuntu.com/advantage
0 updates can be installed immediately.
0 of these updates are security updates.
Your Hardware Enablement Stack (HWE) is supported until April 2025.
Last login: Tue Oct 6 14:37:47 2020 from 127.0.0.1
:~$ logout
:~$ cd ./.ssh
:~/.ssh$ ls
id_rsa id_rsa.pub known_hosts
:~/.ssh$ cat ./id_rsa.pub >> ./authorized_keys
:~/.ssh$ ls
authorized_keys id_rsa id_rsa.pub known_hosts
Note: installing openssh-server generates host keys under /etc/ssh, not user keys; the ~/.ssh directory appears after the first ssh connection, and the id_rsa/id_rsa.pub pair listed above comes from ssh-keygen. Appending the public key to authorized_keys, as shown, enables the passwordless login that Hadoop's start scripts rely on.
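If the key pair is missing, it can be generated manually (a minimal sketch; the empty passphrase is what makes subsequent logins non-interactive):
:~$ ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
:~$ chmod 600 ~/.ssh/authorized_keys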
:~$ sudo tar -zxvf ./hadoop-3.2.1.tar.gz -C /opt
:~$ cd /opt
:/opt$ sudo mv ./hadoop-3.2.1 ./hadoop
:/opt$ sudo chgrp -R root ./hadoop
:/opt$ sudo chown -R root ./hadoop
:/opt$ sudo chmod -R 755 ./hadoop
:/opt$ ls -al | grep 'hadoop'
drwxr-xr-x 9 root root 4096 9月 11 2019 hadoop
:/opt$ cd
:~$ vim ./.bashrc
In vim, add the HADOOP_HOME, HADOOP_INSTALL, HADOOP_MAPRED_HOME, HADOOP_COMMON_HOME, HADOOP_HDFS_HOME, YARN_HOME, PATH, and HADOOP_CONF_DIR entries, for example:
export HADOOP_HOME=/opt/hadoop
export HADOOP_INSTALL=${HADOOP_HOME}
export HADOOP_MAPRED_HOME=${HADOOP_HOME}
export HADOOP_COMMON_HOME=${HADOOP_HOME}
export HADOOP_HDFS_HOME=${HADOOP_HOME}
export YARN_HOME=${HADOOP_HOME}
export HADOOP_CONF_DIR=${HADOOP_HOME}/etc/hadoop
export PATH=${PATH}:${HADOOP_HOME}/bin:${HADOOP_HOME}/sbin
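After saving .bashrc, reload it and check that the hadoop binary resolves:
:~$ source ~/.bashrc
:~$ hadoop version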