Data type conversion comes up in every language and framework. It looks very simple, but becoming familiar with all of the type mappings still takes some hands-on experience.
Spark SQL data type | Scala data type |
---|---|
ByteType | Byte |
ShortType | Short |
IntegerType | Int |
LongType | Long |
FloatType | Float |
DoubleType | Double |
DecimalType | scala.math.BigDecimal |
StringType | String |
BinaryType | Array[Byte] |
BooleanType | Boolean |
TimestampType | java.sql.Timestamp |
DateType | java.sql.Date |
ArrayType | scala.collection.Seq |
MapType | scala.collection.Map |
StructType | org.apache.spark.sql.Row |
StructField | The value type in Scala of the data type of this field (For example, Int for a StructField with the data type IntegerType) |
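For reference, these types are what you use when declaring a schema explicitly. The sketch below is a minimal example (the appName and nullability flags are arbitrary choices) that reads the comma-separated user file used later in this post with a predefined schema, so no cast is needed afterwards:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}

val spark = SparkSession.builder().appName("schema-demo").master("local[*]").getOrCreate()

// Each StructField binds a column name to one of the Spark SQL types above;
// in Scala the values come back as the mapped types (Int, String, ...).
val schema = StructType(Seq(
  StructField("id", IntegerType, nullable = false),
  StructField("name", StringType, nullable = true),
  StructField("age", IntegerType, nullable = true)
))

// Apply the schema while reading, so the columns are typed from the start.
val df = spark.read.schema(schema).csv("./data/user")
df.printSchema()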
To convert the type of an existing column, the one-sentence summary is: call the cast method on the Column class.
The different ways of referring to a column were covered before:
df("columnName") // On a specific `df` DataFrame.
col("columnName") // A generic column not yet associated with a DataFrame.
col("columnName.field") // Extracting a struct field
col("`a.column.with.dots`") // Escape `.` in column names.
$"columnName" // Scala short hand for a named column.
The examples below use the following sample data in ./data/user:

1,tom,23
2,jack,24
3,lily,18
4,lucy,19
import org.apache.spark.sql.SparkSession

val spark = SparkSession
  .builder()
  .appName("test")
  .master("local[*]")
  .getOrCreate()

import spark.implicits._  // required for the tuple encoder and toDF

// Read the text file, split each line on ",", and name the three columns.
spark.read
  .textFile("./data/user")
  .map(_.split(","))
  .map(x => (x(0), x(1), x(2)))
  .toDF("id", "name", "age")
  .dtypes
  .foreach(println)
Result:
(id,StringType)
(name,StringType)
(age,StringType)
This shows that all columns are StringType by default.
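As an aside, when the source is a well-formed delimited file, the CSV reader can infer the types instead of leaving everything as strings; a sketch against the same ./data/user file:

// Let Spark infer the column types from the data itself.
val inferred = spark.read
  .option("inferSchema", "true")
  .csv("./data/user")
  .toDF("id", "name", "age")

inferred.dtypes.foreach(println)  // id and age should come back as IntegerType

Sticking with the text-file pipeline from above, the fix is to cast the columns in the select: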
import spark.implicits._

// Same pipeline as above, but cast id and age to int in the select.
spark.read
  .textFile("./data/user")
  .map(_.split(","))
  .map(x => (x(0), x(1), x(2)))
  .toDF("id", "name", "age")
  .select($"id".cast("int"), $"name", $"age".cast("int"))
  .dtypes
  .foreach(println)
Result:
(id,IntegerType)
(name,StringType)
(age,IntegerType)
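The same result can also be reached with withColumn, which replaces a column in place instead of re-listing every column in select; a sketch assuming df is the DataFrame built by the toDF call above:

import org.apache.spark.sql.types.IntegerType

val casted = df
  .withColumn("id", $"id".cast(IntegerType))
  .withColumn("age", $"age".cast(IntegerType))

casted.dtypes.foreach(println)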
The cast method (available since Spark 1.3.0) accepts either a canonical type-name string or a DataType object; the two forms below are equivalent:

// Casts colA to integer using the type-name string.
df.select(df("colA").cast("int"))

// Casts colA to IntegerType using the DataType object; equivalent to the line above.
import org.apache.spark.sql.types.IntegerType
df.select(df("colA").cast(IntegerType))