When packaging the Scala program with sbt, the following errors occurred:
[error] /home/hadoop/sparkapp/src/main/scala/RecommendApp.scala:25:144: value toDF is not a member of org.apache.spark.rdd.RDD[Movie]
[error] val moviesDF = moviesRDD.map(x => Movie (x.split("::")(0).toInt,x.split("::")(1).replaceAll("[0-9()]","").trim,x.split("::")(2).trim)).toDF
[error] ^
[error] /home/hadoop/sparkapp/src/main/scala/SimpleApp.scala:33:61: value toDF is not a member of org.apache.spark.rdd.RDD[Rating]
[error] val ratingsDF = ratingsRDD.map(parseRating).toDF
[error] ^
With guidance from the course instructor, two points turned out to matter when resolving the errors above:
1. To enable the implicit conversions, create the SparkSession object yourself inside the main function and import the implicits from that object, rather than importing them before the object definition.
2. The case class declarations must be placed before the main function, at the object level, not inside it.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SparkSession
// import spark.implicits._   // wrong place: the SparkSession `spark` does not exist yet

object RecommendApp {
  // correct place: declare the case classes before main, at the object level
  case class Rating(user_id: Int, movie_id: Int, rating: Float, timestamp: Long)
  case class Movie(movie_id: Int, movie_name: String, movie_type: String)

  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    val spark = SparkSession.builder().getOrCreate()
    import spark.implicits._   // correct place: import from the SparkSession created above

    // wrong place: declaring the case classes here keeps toDF from resolving
    // case class Rating(user_id: Int, movie_id: Int, rating: Float, timestamp: Long)
    // case class Movie(movie_id: Int, movie_name: String, movie_type: String)

    // ... rest of the application
  }
}
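For reference, with the imports and case classes arranged as above, the two lines from the error messages compile again. The sketch below shows roughly how they resolve; the input paths and the body of parseRating are assumptions, not taken from the original error output.

  // assumed helper at the object level, next to the case classes
  def parseRating(line: String): Rating = {
    val f = line.split("::")
    Rating(f(0).toInt, f(1).toInt, f(2).toFloat, f(3).toLong)
  }

  // inside main, after "import spark.implicits._" (the paths are hypothetical)
  val moviesRDD  = sc.textFile("file:///home/hadoop/sparkapp/data/movies.dat")
  val ratingsRDD = sc.textFile("file:///home/hadoop/sparkapp/data/ratings.dat")

  val moviesDF = moviesRDD.map(x => Movie(
      x.split("::")(0).toInt,
      x.split("::")(1).replaceAll("[0-9()]", "").trim,
      x.split("::")(2).trim)).toDF()    // toDF now resolves via spark.implicits._

  val ratingsDF = ratingsRDD.map(parseRating).toDF()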