Download link: https://pan.baidu.com/s/1ivSvMe52KOuGtAmGPB8U0A
Password: jvx8
vim /etc/profile
Add the following lines:
export JAVA_HOME=/home/bigdata/jdk
export HADOOP_HOME=/home/bigdata/hadoop
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
Then reload the file so the variables take effect: source /etc/profile
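To confirm the exports resolve before moving on, here is a quick sketch that writes the same lines to a throw-away file under /tmp (so /etc/profile itself is untouched), sources it, and prints the results; the paths assume the /home/bigdata layout used in this guide:

```shell
# Write the exports to a scratch file, source it, and echo the
# resulting variables to verify they are set as intended.
cat > /tmp/bigdata-env.sh <<'EOF'
export JAVA_HOME=/home/bigdata/jdk
export HADOOP_HOME=/home/bigdata/hadoop
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
EOF
. /tmp/bigdata-env.sh
echo "JAVA_HOME=$JAVA_HOME"
echo "HADOOP_HOME=$HADOOP_HOME"
```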
Edit the hosts file:
vim /etc/hosts
Add one line per node, in the form "IP-address hostname":
IP-address   master
IP-address   slave1
IP-address   slave2
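As a concrete illustration, with example private addresses (replace them with the real IPs of your three virtual machines), the added /etc/hosts lines would look like:

```
192.168.1.101   master
192.168.1.102   slave1
192.168.1.103   slave2
```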
Generate an SSH key pair and copy it to every node:
ssh-keygen -t rsa
ssh-copy-id master
ssh-copy-id slave1
ssh-copy-id slave2
Verify with "ssh <hostname>": if the shell prompt switches to that hostname without asking for a password, passwordless login is working.
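The manual check above can also be scripted. A minimal sketch, assuming the master/slave1/slave2 hostnames from /etc/hosts: BatchMode makes ssh refuse password prompts, so the probe only succeeds when key-based login actually works.

```shell
# Probe a node over SSH without allowing password prompts; prints OK
# only when passwordless (key-based) login succeeds.
check_node() {
  if ssh -o BatchMode=yes -o ConnectTimeout=5 "$1" true 2>/dev/null; then
    echo "$1: OK"
  else
    echo "$1: FAILED"
  fi
}
for h in master slave1 slave2; do
  check_node "$h"
done
```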
Go to the directory /home/bigdata/hadoop/etc/hadoop
Edit the configuration files:
File 1: hadoop-env.sh
    export JAVA_HOME=/home/bigdata/jdk

For files 2-5 below, each <property> block goes inside the file's <configuration>...</configuration> element.

File 2: core-site.xml
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://master:8020</value>
    </property>
    <property>
        <name>io.file.buffer.size</name>
        <value>4096</value>
    </property>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/home/bigdata/tmp</value>
    </property>

File 3: hdfs-site.xml
    <property>
        <name>dfs.replication</name>
        <value>3</value>
    </property>
    <property>
        <name>dfs.block.size</name>
        <value>134217728</value>
    </property>
    <property>
        <name>dfs.namenode.name.dir</name>
        <value>file:///home/hadoopdata/dfs/name</value>
    </property>
    <property>
        <name>dfs.datanode.data.dir</name>
        <value>file:///home/hadoopdata/dfs/data</value>
    </property>
    <property>
        <name>fs.checkpoint.dir</name>
        <value>file:///home/hadoopdata/checkpoint/dfs/cname</value>
    </property>
    <property>
        <name>fs.checkpoint.edits.dir</name>
        <value>file:///home/hadoopdata/checkpoint/dfs/cname</value>
    </property>
    <property>
        <name>dfs.http.address</name>
        <value>master:50070</value>
    </property>
    <property>
        <name>dfs.secondary.http.address</name>
        <value>slave1:50090</value>
    </property>
    <property>
        <name>dfs.webhdfs.enabled</name>
        <value>true</value>
    </property>
    <property>
        <name>dfs.permissions</name>
        <value>false</value>
    </property>

File 4: mapred-site.xml (first create it from the template: mv mapred-site.xml.template mapred-site.xml)
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
        <final>true</final>
    </property>
    <property>
        <name>mapreduce.jobhistory.address</name>
        <value>master:10020</value>
    </property>
    <property>
        <name>mapreduce.jobhistory.webapp.address</name>
        <value>master:19888</value>
    </property>

File 5: yarn-site.xml
    <property>
        <name>yarn.resourcemanager.hostname</name>
        <value>master</value>
    </property>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
    <property>
        <name>yarn.resourcemanager.address</name>
        <value>master:8032</value>
    </property>
    <property>
        <name>yarn.resourcemanager.scheduler.address</name>
        <value>master:8030</value>
    </property>
    <property>
        <name>yarn.resourcemanager.resource-tracker.address</name>
        <value>master:8031</value>
    </property>
    <property>
        <name>yarn.resourcemanager.admin.address</name>
        <value>master:8033</value>
    </property>
    <property>
        <name>yarn.resourcemanager.webapp.address</name>
        <value>master:8088</value>
    </property>

File 6: slaves (one hostname per line)
    master
    slave1
    slave2
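Before distributing the files, it is worth checking that each edited site file is still well-formed XML, since a stray unclosed <property> tag will break startup. A sketch, validating a throw-away sample under /tmp; point CONF_DIR at /home/bigdata/hadoop/etc/hadoop to check the real files:

```shell
# Validate every *.xml under CONF_DIR with Python's stdlib XML parser;
# a malformed file makes the parse (and this loop) fail loudly.
CONF_DIR=/tmp/hadoop-conf-check
mkdir -p "$CONF_DIR"
cat > "$CONF_DIR/core-site.xml" <<'EOF'
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://master:8020</value>
  </property>
</configuration>
EOF
for f in "$CONF_DIR"/*.xml; do
  python3 -c 'import sys, xml.dom.minidom; xml.dom.minidom.parse(sys.argv[1])' "$f" \
    && echo "$(basename "$f"): well-formed"
done
```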
Once configuration is done, copy everything to the other virtual machines:
scp -r /home/bigdata/hadoop slave1:/home/bigdata/
scp -r /home/bigdata/jdk slave1:/home/bigdata/
scp -r /home/bigdata/hadoop slave2:/home/bigdata/
scp -r /home/bigdata/jdk slave2:/home/bigdata/
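The four copies above can be generated in one loop; this sketch only echoes the commands as a dry run, so drop the leading "echo" to actually copy (hostnames assume the slave1/slave2 entries from /etc/hosts):

```shell
# Print the scp command for each (slave, directory) pair; remove the
# "echo" to perform the real transfers.
for host in slave1 slave2; do
  for dir in /home/bigdata/hadoop /home/bigdata/jdk; do
    echo scp -r "$dir" "$host:/home/bigdata/"
  done
done
```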
On the master, format HDFS (only once; reformatting destroys existing metadata):
hadoop namenode -format
Start the cluster:
start-all.sh
To shut it down:
stop-all.sh
Check the running Java daemons on each node with:
jps
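As a sketch for reading the jps output: the daemon names below are what a Hadoop 2.x cluster laid out as above should show (with the SecondaryNameNode on slave1, per dfs.secondary.http.address). The helper takes the jps output as a string so it can be checked on any node.

```shell
# Check a node's jps output for the expected daemon names; -w prevents
# "NameNode" from matching inside "SecondaryNameNode".
check_daemons() {   # $1 = jps output, remaining args = expected names
  out=$1; shift
  for d in "$@"; do
    echo "$out" | grep -qw "$d" || { echo "missing: $d"; return 1; }
  done
  echo "all expected daemons running"
}

# On the master you would run, for example:
#   check_daemons "$(jps)" NameNode ResourceManager DataNode NodeManager
# On slave1:
#   check_daemons "$(jps)" DataNode NodeManager SecondaryNameNode
```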