Exception in thread "main" org.apache.spark.SparkException: A master URL must be set in your configuration
at org.apache.spark.SparkContext.<init>(SparkContext.scala:367)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2493)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:933)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:924)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:924)
at com.rone.DBUtils.MySQLOperations$.main(MySQLOperations.scala:14)
at com.rone.DBUtils.MySQLOperations.main(MySQLOperations.scala)
This error occurs because the application was run locally without setting the master to local.
Incorrect example:
val spark = SparkSession.builder().appName("ReadMySQL").getOrCreate()
In my case, I had packaged the project as a jar, uploaded it to HDFS, and was running it on YARN via Livy (where the master must not be set to local), so I had commented out the master setting. To run locally again, just uncomment it, as follows:
val spark = SparkSession.builder().appName("ReadMySQL").master("local").getOrCreate()
Best Wishes