tar -zxvf spark-2.3.0-bin-hadoop2.7.tgz
# Spark environment
export SPARK_HOME=/opt/spark/spark-2.3.0-bin-hadoop2.7
export PATH="$SPARK_HOME/bin:$PATH"
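Before moving on, it can help to confirm the two exports actually took effect; a minimal check, assuming they were appended to your shell profile (e.g. `~/.bashrc`, adjust for your shell):

```shell
# Reload the profile so the exports apply to the current shell
# (assumes the two export lines above were added to ~/.bashrc).
source ~/.bashrc

# SPARK_HOME should point at the extracted directory
[ -d "$SPARK_HOME" ] && echo "SPARK_HOME OK: $SPARK_HOME"

# spark-shell should now resolve via PATH
command -v spark-shell
```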
mkdir spark_file_test
touch hello_spark
vi hello_spark
Hello Spark!
Hello Spark!
Hello Spark!
Hello Spark!
Hello Spark!
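As an alternative to typing the lines in vi, the test file can be generated non-interactively; a small sketch, with the file name and line count taken from the spark-shell session below (`count()` returns 5, `first` returns `Hello Spark!`):

```shell
# Create the input file without an editor: five identical lines,
# matching the count()/first() results shown later in spark-shell.
mkdir -p spark_file_test
for i in 1 2 3 4 5; do
  echo 'Hello Spark!'
done > spark_file_test/hello_spark

wc -l spark_file_test/hello_spark   # 5 lines
```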
spark-shell
2018-04-30 09:35:53 WARN Utils:66 - Your hostname, localhost.localdomain resolves to a loopback address: 127.0.0.1; using 192.168.159.128 instead (on interface eth0)
2018-04-30 09:35:53 WARN Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
2018-04-30 09:35:57 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
Spark context Web UI available at http://192.168.159.128:4040
Spark context available as 'sc' (master = local[*], app id = local-1524847005612).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.3.0
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) Client VM, Java 1.8.0_171)
Type in expressions to have them evaluated.
Type :help for more information.
scala> val lines = sc.textFile("../../spark_file_test/hello_spark")
2018-04-27 09:40:53 WARN SizeEstimator:66 - Failed to check whether UseCompressedOops is set; assuming yes
lines: org.apache.spark.rdd.RDD[String] = ../../spark_file_test/hello_spark MapPartitionsRDD[1] at textFile at <console>:24

scala> lines.count()
res0: Long = 5

scala> lines.first
res1: String = Hello Spark!
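The two Spark results above can be sanity-checked outside Spark with standard tools; a quick cross-check (it recreates the five-line test file so it runs on its own):

```shell
# Recreate the test file so this check is self-contained
mkdir -p spark_file_test
for i in 1 2 3 4 5; do echo 'Hello Spark!'; done > spark_file_test/hello_spark

wc -l < spark_file_test/hello_spark    # same as lines.count(): 5
head -n 1 spark_file_test/hello_spark  # same as lines.first: Hello Spark!
```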