Download the Scala installation package from the official website yourself.
tar -zxvf scala-2.11.12.tgz
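The tarball unpacks to `scala-2.11.12`, while the profile entries below point at `/opt/software/scala2.11`, so the unpacked directory has to be moved and renamed. A minimal sketch (the temporary prefix is only so the sketch is safe to run as-is; on a real machine use `DEST=/opt/software`):

```shell
# Stand-in for the directory the tarball unpacks to
mkdir -p scala-2.11.12
# DEST defaults to a throwaway prefix; set DEST=/opt/software on a real host
DEST="${DEST:-$(mktemp -d)/opt/software}"
mkdir -p "$DEST"
# Move and rename to match the SCALA_HOME used below
mv scala-2.11.12 "$DEST/scala2.11"
ls "$DEST"   # prints: scala2.11
```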
export SCALA_HOME=/opt/software/scala2.11
export PATH=$PATH:$SCALA_HOME/bin
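After appending these lines to /etc/profile, run `source /etc/profile` so they take effect in the current shell. The PATH append itself can be sketched and checked anywhere:

```shell
# Assumed install path from the export above
export SCALA_HOME=/opt/software/scala2.11
export PATH=$PATH:$SCALA_HOME/bin
# Confirm the bin directory is now on the search path
echo "$PATH" | grep -q "$SCALA_HOME/bin" && echo "PATH updated"
```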
scala
Download the Spark installation package yourself.
tar -zxvf spark-2.4.7-bin-hadoop2.6.tgz
export JAVA_HOME=/root/software/jdk1.8.0_221
export SCALA_HOME=/opt/software/scala2.11
export SPARK_HOME=/opt/software/spark247
export HADOOP_HOME=/root/software/hadoop
export HADOOP_CONF_DIR=/root/software/hadoop/etc/hadoop
export SPARK_MASTER_IP=hadoop100
export SPARK_EXECUTOR_MEMORY=1G
export SPARK_HOME=/opt/software/spark247
export SPARK_CONF_DIR=$SPARK_HOME/conf
export PATH=$PATH:$SPARK_HOME/bin
spark-shell
For a cluster, edit the slaves file (the equivalent of the slaves/workers file in fully distributed Hadoop)
and add the IPs of the nodes joining the cluster.
Also, in spark-env.sh, set the master IP to the master node on every machine, and install Scala etc. on every node.
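For reference, a hypothetical conf/slaves for the cluster described above (the worker hostnames are assumptions following the hadoop100 master naming; in Spark 3.x this file is named workers):

```
# conf/slaves — one worker host per line (hypothetical hostnames)
hadoop101
hadoop102
```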