Download Hadoop from: http://archive.apache.org/dist/hadoop/common/hadoop-3.2.2/
Download hadoop-winutils from: https://github.com/cdarlint/winutils
Copy the files from the winutils repository's hadoop-3.2.2/bin folder (winutils.exe, hadoop.dll and the accompanying files) into your hadoop-3.2.2\bin directory. Hadoop is written mainly for Linux; winutils.exe emulates the Linux directory environment, so Hadoop needs this helper in order to run on Windows.
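For example, assuming the winutils repository was extracted to G:\winutils and Hadoop was unpacked to D:\hadoop\hadoop-3.2.2 (both paths are assumptions; adjust them to your own layout), the copy can be done from a command prompt:
:: copy winutils.exe, hadoop.dll and the other native helpers for 3.2.2
xcopy /Y "G:\winutils\hadoop-3.2.2\bin\*" "D:\hadoop\hadoop-3.2.2\bin\"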
Configure the environment variables:
Add a new system variable HADOOP_HOME pointing to your Hadoop installation directory.
Edit the Path system variable and add %HADOOP_HOME%\bin (and %HADOOP_HOME%\sbin).
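If you prefer the command line to the System Properties dialog, a minimal sketch from an administrator command prompt looks like this (the install path D:\hadoop\hadoop-3.2.2 is an assumption; setx /M writes the system environment and only takes effect in newly opened consoles):
setx /M HADOOP_HOME "D:\hadoop\hadoop-3.2.2"
:: append bin and sbin so the hadoop/hdfs/start-*.cmd commands resolve
:: (a simplification: %PATH% here is the merged user+system Path, so the GUI editor is the cleaner route)
setx /M PATH "%PATH%;D:\hadoop\hadoop-3.2.2\bin;D:\hadoop\hadoop-3.2.2\sbin"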
Go to the etc/hadoop folder under the Hadoop installation directory and edit the following files:
(1)core-site.xml
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. See accompanying LICENSE file.
-->
<!-- Put site-specific property overrides in this file. -->
<configuration>
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:9000</value>
</property>
<property>
<name>hadoop.tmp.dir</name>
<value>/G:/hadoop/tmp</value>
</property>
</configuration>
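Two notes on the above: fs.default.name is the legacy spelling of fs.defaultFS (both are accepted, and hdfs://localhost:9000 is the address clients will use), and it does no harm to create the hadoop.tmp.dir directory up front. Assuming the G: drive layout used in the value above:
mkdir G:\hadoop\tmp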
(2)hdfs-site.xml
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. See accompanying LICENSE file.
-->
<!-- Put site-specific property overrides in this file. -->
<configuration>
<!-- Set to 1 because this is a single-node Hadoop installation -->
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
<property>
<name>dfs.namenode.name.dir</name>
<!-- This directory must be created beforehand -->
<value>file:/G:/hadoop/data/dfs/namenode</value>
</property>
<property>
<name>dfs.datanode.data.dir</name>
<!-- This directory must be created beforehand -->
<value>file:/G:/hadoop/data/dfs/datanode</value>
</property>
<property>
<name>dfs.http.address</name>
<value>0.0.0.0:50070</value>
</property>
<property>
<name>dfs.permissions</name>
<!-- So that files can be created and uploaded from the web UI -->
<value>false</value>
</property>
</configuration>
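As the comments above say, the NameNode and DataNode directories must exist before HDFS is formatted and started. With the G: drive layout from the values above:
mkdir G:\hadoop\data\dfs\namenode
mkdir G:\hadoop\data\dfs\datanode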
(3)mapred-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. See accompanying LICENSE file.
-->
<!-- Put site-specific property overrides in this file. -->
<configuration>
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>
</configuration>
(4)yarn-site.xml
<?xml version="1.0"?>
<!--
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. See accompanying LICENSE file.
-->
<configuration>
<property>
<name>yarn.nodemanager.aux-services</name>
<value>mapreduce_shuffle</value>
</property>
<property>
<name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
<value>org.apache.hadoop.mapred.ShuffleHandler</value>
</property>
</configuration>
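Once all four files are saved, a quick sanity check that Hadoop is actually reading core-site.xml is to ask for the default filesystem; it should print the hdfs://localhost:9000 value configured above:
hdfs getconf -confKey fs.defaultFS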
Run hadoop version to verify the Hadoop installation.
Run java -version to verify the JDK.
When Hadoop is installed on Windows 10, running hadoop version may fail with a JAVA_HOME-related error (typically "Error: JAVA_HOME is incorrectly set").
This usually happens when the JDK sits in its default location under C:\Program Files: the path contains a space, which hadoop-env.cmd cannot handle. A JDK installed in a path without spaces generally does not hit this problem.
There are two cases to consider. If your JDK is installed on the C drive at the default path:
First, locate the hadoop-env.cmd script under D:\hadoop\hadoop-3.2.2\etc\hadoop (substitute whatever directory you installed Hadoop into).
Right-click it and choose Edit.
Method 1: use the short-path form, e.g. set JAVA_HOME=C:\PROGRA~1\Java\jdk1.8.0_66,
because PROGRA~1 is the DOS 8.3 short name for the C:\Program Files directory.
File and folder names longer than 8 characters are shortened to their first 6 valid characters plus ~1; if that collides with an existing short name, ~2, ~3 and so on are used.
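You can check the actual short name on your machine (it is usually PROGRA~1, but not guaranteed) with:
dir /x C:\
:: the "Program Files" line of the listing shows its 8.3 alias, e.g. PROGRA~1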
Method 2: wrap the part containing the space in quotes:
set JAVA_HOME="C:\Program Files"\Java\jdk1.8.0_66
The other case: if your JDK is installed in a path without spaces, for example E:\software\jdk1.8, hadoop-env.cmd needs no special treatment; point JAVA_HOME at that path and it just works.
But what if the JDK sits under D:\Program Files and even "D:\Program Files"\Java\jdk1.7.0_03 still fails?
The underlying problem is the space in the path: the quote trick is not reliable there, so the simplest fix is to install (or move) the JDK into a path without spaces and point JAVA_HOME at that location.
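With JAVA_HOME sorted out, the usual startup sequence is to format the NameNode once and then launch the daemons from the sbin folder (run these in an administrator command prompt; %HADOOP_HOME% is the variable configured earlier, and the format step is only needed the first time):
cd /d %HADOOP_HOME%
bin\hdfs namenode -format
:: start-all.cmd launches HDFS and YARN together; start-dfs.cmd + start-yarn.cmd is the equivalent two-step form
sbin\start-all.cmd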
If four console windows appear (one each for the NameNode, DataNode, ResourceManager and NodeManager), startup succeeded.
To double-check that startup succeeded, run the jps command in a console; it lists the Java process IDs and names:
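The PIDs will differ on every machine; what matters is that the four daemons show up alongside Jps itself, roughly like:
12345 NameNode
12346 DataNode
12347 ResourceManager
12348 NodeManager
12349 Jps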
At this point, Hadoop has been installed successfully!
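As a final smoke test, open the NameNode web UI at http://localhost:50070 (the port set via dfs.http.address above) and try a couple of HDFS commands from the console, for example:
hdfs dfs -mkdir /test
hdfs dfs -ls /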