[root@cxy opt]# tar -zxf spark-3.2.1-bin-hadoop2.7.tgz -C /usr/local/
[root@cxy opt]# cd /usr/local/spark-3.2.1-bin-hadoop2.7/conf/
[root@cxy conf]# cp spark-env.sh.template spark-env.sh
Open spark-env.sh and append the following settings:
export JAVA_HOME=/opt/jdk                      # JDK installation path
export HADOOP_HOME=/opt/hadoop                 # Hadoop installation path
export HADOOP_CONF_DIR=/opt/hadoop/etc/hadoop  # Hadoop configuration directory
export SPARK_MASTER_IP=192.168.130.10          # IP address or hostname of the Spark master
export SPARK_LOCAL_IP=192.168.130.10           # local IP address or hostname for Spark
[root@cxy conf]# cd /usr/local/spark-3.2.1-bin-hadoop2.7/sbin/
[root@cxy sbin]# ./start-all.sh
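If the daemons came up cleanly, jps (bundled with the JDK) should list a Master and a Worker process alongside any Hadoop daemons — a quick sanity check; the exact output will vary with your setup:

```
[root@cxy sbin]# jps
```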
[root@cxy sbin]# cd /usr/local/spark-3.2.1-bin-hadoop2.7/
[root@cxy spark-3.2.1-bin-hadoop2.7]# ./bin/spark-shell
After spark-shell has started successfully, visit http://192.168.130.10:8080; the Spark master web UI shows the corresponding Spark application and its details.
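As a quick smoke test inside spark-shell, any small job will do — for example, summing the integers 1 to 100 (sc is the SparkContext that spark-shell creates for you):

```
scala> sc.parallelize(1 to 100).reduce(_ + _)
res0: Int = 5050
```

The job will also appear in the master web UI while the shell session is open.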
[root@cxy opt]# tar -zxf scala-2.11.8.tgz -C /usr/local/
Then append the following to /etc/profile to put Scala on the PATH:
export SCALA_HOME=/usr/local/scala-2.11.8
export PATH=$PATH:$SCALA_HOME/bin
[root@cxy opt]# source /etc/profile
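To confirm the Scala installation took effect, check the version from any directory (it should report 2.11.8):

```
[root@cxy opt]# scala -version
```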