
Spark pitfall record: value toDF is not a member of org.apache.spark.rdd.RDD

While packaging a Scala program with sbt, the following errors came up:

[error] /home/hadoop/sparkapp/src/main/scala/RecommendApp.scala:25:144: value toDF is not a member of org.apache.spark.rdd.RDD[Movie]
[error] val moviesDF = moviesRDD.map(x => Movie (x.split("::")(0).toInt,x.split("::")(1).replaceAll("[0-9()]","").trim,x.split("::")(2).trim)).toDF
[error] /home/hadoop/sparkapp/src/main/scala/SimpleApp.scala:33:61: value toDF is not a member of org.apache.spark.rdd.RDD[Rating]
[error] val ratingsDF = ratingsRDD.map(parseRating).toDF
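
Some background on why the compiler complains (based on Spark's public API, not on this project's code): toDF is not actually defined on RDD. It is supplied by import spark.implicits._, which brings an implicit conversion along the lines of the simplified signature below into scope. The conversion also requires an implicit Encoder for the element type, and Spark can only derive one for a case class defined at object or top level, which is why both points in the fix below matter.

    // Simplified from org.apache.spark.sql.SQLImplicits: the import adds this
    // implicit conversion, which wraps an RDD[T] in a holder defining toDF()/toDS().
    // Without the import the conversion is not in scope, so the compiler reports
    // "value toDF is not a member of org.apache.spark.rdd.RDD[...]".
    implicit def rddToDatasetHolder[T: Encoder](rdd: RDD[T]): DatasetHolder[T]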

With some pointers from the course instructor, two things turned out to need attention while resolving the errors above:

1. To enable the implicit conversions, create a SparkSession object yourself inside the main function and import the implicits from that object, rather than trying to enable them before the object definition.

2. The case class declarations must be placed before the main function, at object scope, as the annotated example below shows:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SparkSession
    // import spark.implicits._   // wrong: no `spark` value exists at this point

    object RecommendApp {
      // correct: case classes declared before main, at object scope
      case class Rating(user_id: Int, movie_id: Int, rating: Float, timestamp: Long)
      case class Movie(movie_id: Int, movie_name: String, movie_type: String)

      def main(args: Array[String]) {
        val conf = new SparkConf().setAppName("Simple Application")
        val sc = new SparkContext(conf)
        val spark = SparkSession.builder().getOrCreate()
        import spark.implicits._   // correct: enabled via the SparkSession created above

        // wrong: declaring the case classes here, inside main, is what breaks
        // toDF, because no Encoder can be derived for a class local to main
        // case class Rating(user_id: Int, movie_id: Int, rating: Float, timestamp: Long)
        // case class Movie(movie_id: Int, movie_name: String, movie_type: String)
      }
    }
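
Putting the two points together, here is a complete sketch that actually reaches the toDF calls from the error log. The data paths and the parseRating helper are illustrative guesses reconstructed from the error messages, not the original project's source:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SparkSession

    object RecommendApp {
      case class Rating(user_id: Int, movie_id: Int, rating: Float, timestamp: Long)
      case class Movie(movie_id: Int, movie_name: String, movie_type: String)

      // hypothetical parser for "user::movie::rating::timestamp" lines
      def parseRating(line: String): Rating = {
        val f = line.split("::")
        Rating(f(0).toInt, f(1).toInt, f(2).toFloat, f(3).toLong)
      }

      def main(args: Array[String]) {
        val conf = new SparkConf().setAppName("Simple Application")
        val sc = new SparkContext(conf)
        val spark = SparkSession.builder().getOrCreate()
        import spark.implicits._

        // placeholder paths -- substitute your own data files
        val moviesRDD = sc.textFile("file:///home/hadoop/data/movies.dat")
        val ratingsRDD = sc.textFile("file:///home/hadoop/data/ratings.dat")

        val moviesDF = moviesRDD.map { x =>
          val f = x.split("::")
          Movie(f(0).toInt, f(1).replaceAll("[0-9()]", "").trim, f(2).trim)
        }.toDF()                               // resolves now, via spark.implicits._
        val ratingsDF = ratingsRDD.map(parseRating).toDF()

        moviesDF.show(5)
        ratingsDF.show(5)
        spark.stop()
      }
    }

With the SparkSession created first, the implicits imported from it, and the case classes at object scope, both toDF calls should now resolve and the sbt build should go through.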

 
