
spark-submit job fails with java.sql.SQLException: No suitable driver

Reading MySQL from Spark works fine when run from IDEA, but the same job fails with "No suitable driver" once submitted to the YARN cluster.

Submitting the Spark job kept failing with: No suitable driver
The submit command was as follows:

spark-submit \
--driver-class-path /opt/sparkJob/mysql-connector-java-5.1.47.jar \
--master yarn \
--deploy-mode cluster \
--driver-memory 4G \
--executor-memory 2G \
--total-executor-cores 12 \
--class com.xxx.xxx.Demo /opt/sparkJob/xxx.jar

Every run of this command failed with the same error. After searching around I tried adding --jars /opt/sparkJob/mysql-connector-java-5.1.47.jar, which still did not help. Further reading showed that --conf spark.executor.extraClassPath=/opt/sparkJob/mysql-connector-java-5.1.47.jar is needed as well, so that the executors (and not only the driver) have the MySQL connector on their classpath.
The complete command is as follows:
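
(The original post does not show the application code. For context, below is a minimal sketch of the kind of JDBC read that triggers this driver lookup; the JDBC URL, table name, and credentials are placeholders, not values from the post. Naming the driver class explicitly via the "driver" property makes the lookup less dependent on DriverManager auto-discovery, but the connector jar still has to be on the driver and executor classpaths as described here.)

import java.util.Properties
import org.apache.spark.sql.SparkSession

object Demo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("MySQLRead").getOrCreate()

    val props = new Properties()
    props.put("user", "spark_user")   // placeholder credentials
    props.put("password", "secret")
    // Explicitly naming the driver class; the jar must still be present
    // on both the driver and the executor classpaths.
    props.put("driver", "com.mysql.jdbc.Driver")

    // Placeholder JDBC URL and table name
    val df = spark.read.jdbc("jdbc:mysql://db-host:3306/testdb", "some_table", props)
    df.show()

    spark.stop()
  }
}
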

spark-submit \
--jars /opt/sparkJob/mysql-connector-java-5.1.47.jar \
--driver-class-path /opt/sparkJob/mysql-connector-java-5.1.47.jar \
--conf spark.executor.extraClassPath=/opt/sparkJob/mysql-connector-java-5.1.47.jar \
--master yarn \
--deploy-mode cluster \
--driver-memory 4G \
--executor-memory 2G \
--total-executor-cores 12 \
--class com.xxx.xxx.Demo /opt/sparkJob/xxx.jar

All three options must be present for the job to work: --jars, --driver-class-path, and --conf spark.executor.extraClassPath.
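
(As a quick way to confirm the fix actually reached both sides, a small diagnostic job like the following sketch can be submitted with the same flags; it simply tries to load the driver class on the driver and on each executor. The object name and partition count are arbitrary, not from the original post.)

import org.apache.spark.sql.SparkSession

object ClasspathCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("ClasspathCheck").getOrCreate()

    // Driver side: throws ClassNotFoundException if --driver-class-path is missing or wrong.
    Class.forName("com.mysql.jdbc.Driver")
    println("driver: MySQL driver class found")

    // Executor side: each task throws if --jars / spark.executor.extraClassPath is missing or wrong.
    val allFound = spark.sparkContext.parallelize(1 to 4, 4).map { _ =>
      Class.forName("com.mysql.jdbc.Driver")
      true
    }.reduce(_ && _)
    println(s"executors: MySQL driver class found = $allFound")

    spark.stop()
  }
}
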
