Unpack the Spark tarball, move it into /usr/local/src, and rename the directory:
tar -zxvf spark-2.1.1-bin-hadoop2.7.tgz
mv spark-2.1.1-bin-hadoop2.7 /usr/local/src
cd /usr/local/src
mv spark-2.1.1-bin-hadoop2.7 spark
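A quick check that the extraction and rename worked, assuming the standard layout of the spark-2.1.1-bin-hadoop2.7 distribution:
ls /usr/local/src/spark        # should list bin, conf, sbin, jars, examples among others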
Next, configure the environment variables:
vi /root/.bash_profile
Append the following at the end of the file:
# set spark environment
export SPARK_HOME=/usr/local/src/spark
export PATH=$PATH:$SPARK_HOME/bin
Reload the profile so the variables take effect:
source /root/.bash_profile
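A quick sanity check that the new variables are in effect:
echo $SPARK_HOME               # should print /usr/local/src/spark
spark-submit --version         # should report Spark version 2.1.1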
cd /usr/local/src/spark/conf/
cp spark-env.sh.template spark-env.sh
vi spark-env.sh
Append the following settings at the end of the file:
export JAVA_HOME=/usr/local/src/jdk1.8.0_162                     # JDK installation path
export HADOOP_CONF_DIR=/usr/local/src/hadoop-2.7.7/etc/hadoop    # lets Spark find the HDFS/YARN configuration
export SPARK_MASTER_HOST=master                                  # hostname the master binds to
export SPARK_MASTER_PORT=7077                                    # port the master listens on
export SPARK_WORKER_MEMORY=4g                                    # total memory a worker may hand to executors
export SPARK_WORKER_CORES=3                                      # total cores a worker may hand to executors
export SPARK_WORKER_INSTANCES=1                                  # worker processes per node
export SPARK_EXECUTOR_MEMORY=4g                                  # default memory per executor
export SPARK_EXECUTOR_CORES=3                                    # default cores per executor
Set these paths according to your own installation directories.
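It is worth confirming the two paths referenced above actually exist before continuing (these are the example paths from this guide; substitute your own):
ls /usr/local/src/jdk1.8.0_162/bin/java
ls /usr/local/src/hadoop-2.7.7/etc/hadoop/core-site.xml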
cp slaves.template slaves
vi slaves
List every node that should run a Worker, one hostname per line (including master here means a Worker also runs on the master node):
master
slave1
slave2
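start-all.sh launches a Worker on every host listed here over SSH, so passwordless SSH from master to all three nodes (including master itself) must already be configured. A quick check, assuming the hostnames above resolve:
for h in master slave1 slave2; do ssh root@$h hostname; done    # should print each hostname without asking for a password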
cp spark-defaults.conf.template spark-defaults.conf
vi spark-defaults.conf
spark.master                    spark://master:7077
spark.eventLog.enabled          true
spark.eventLog.dir              hdfs://master:8021/directory
spark.history.fs.logDirectory   hdfs://master:8021/directory
The spark.history.fs.logDirectory property points the history server at the same event log directory; it is needed when the history server is started below. Note that the hdfs:// port must match fs.defaultFS in core-site.xml (8020 and 9000 are the common Hadoop 2.x defaults), so keep 8021 only if your cluster really uses it.
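Spark does not create the event log directory itself, so create it in HDFS first; a short sketch, assuming HDFS is already running and the path configured above:
hdfs dfs -mkdir -p /directory
hdfs dfs -ls /                 # confirm /directory now exists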
Distribute the configured Spark directory and the profile to both workers:
scp -r /usr/local/src/spark root@slave1:/usr/local/src/
scp -r /usr/local/src/spark root@slave2:/usr/local/src/
scp /root/.bash_profile root@slave1:/root/
scp /root/.bash_profile root@slave2:/root/
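The copied profile only takes effect on the next login; a quick check that both workers now resolve SPARK_HOME:
for h in slave1 slave2; do echo -n "$h: "; ssh root@$h 'source /root/.bash_profile && echo $SPARK_HOME'; done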
On slave1 and slave2, edit /usr/local/src/spark/conf/spark-env.sh on each node.
ssh root@slave1
cd /usr/local/src/spark/conf/
vi spark-env.sh
Add:
export SPARK_LOCAL_IP=192.168.1.102    # this node's own IP; use the address of whichever node you are editing
# export SPARK_LOCAL_IP=192.168.1.103  # on slave2, use this line instead
Make the same edit on slave2 with its own address, then exit back to the master node and start the cluster:
cd /usr/local/src/spark/sbin/
./start-all.sh
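If startup succeeded, jps on each node shows the daemons (master runs both a Master and a Worker because it is listed in slaves), and the master web UI on port 8080 shows all three workers registered:
jps                            # master: Master, Worker; slave1/slave2: Worker
# then open http://master:8080 in a browser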
Passing the log directory as a command-line argument to start-history-server.sh has been deprecated since Spark 1.0; with the spark.history.fs.logDirectory property set above, start it with no arguments:
./start-history-server.sh
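A quick smoke test, submitting the bundled SparkPi example against the standalone master (the jar path follows the spark-2.1.1-bin-hadoop2.7 layout):
spark-submit --class org.apache.spark.examples.SparkPi \
  --master spark://master:7077 \
  $SPARK_HOME/examples/jars/spark-examples_2.11-2.1.1.jar 100
# the finished run should then appear in the history server UI at http://master:18080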
To shut the cluster down later:
cd /usr/local/src/spark/sbin/
./stop-all.sh
./stop-history-server.sh
If the workers cannot register with the master or the web UIs are unreachable, stop firewalld on every node (and disable it so the change survives reboots):
systemctl stop firewalld
systemctl disable firewalld