Submitting a Spark job keeps failing with the error: No suitable driver
The submit command was as follows:
spark-submit \
--driver-class-path /opt/sparkJob/mysql-connector-java-5.1.47.jar \
--master yarn \
--deploy-mode cluster \
--driver-memory 4G \
--executor-memory 2G \
--total-executor-cores 12 \
--class com.xxx.xxx.Demo /opt/sparkJob/xxx.jar
Every run of this command failed with the same error. After searching around, I tried adding --jars /opt/sparkJob/mysql-connector-java-5.1.47.jar, but that alone did not fix it. Further reading showed that --conf spark.executor.extraClassPath=/opt/sparkJob/mysql-connector-java-5.1.47.jar also has to be added before the job runs successfully.
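For context, the error itself comes from JDBC code inside the job. The sketch below is a minimal Scala example of the kind of read that can trigger it; the real com.xxx.xxx.Demo is not shown in this post, and the host, database, table and credentials here are placeholders. When java.sql.DriverManager cannot find a registered driver matching the jdbc:mysql:// URL on the driver or executor classpath, it throws "No suitable driver"; naming the driver class explicitly also guards against the connector failing to auto-register.

import org.apache.spark.sql.SparkSession

object Demo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("MysqlDemo").getOrCreate()

    // Read a MySQL table through the JDBC data source. DriverManager must be
    // able to resolve the jdbc:mysql:// URL to a registered driver class,
    // otherwise it fails with "No suitable driver".
    val df = spark.read
      .format("jdbc")
      .option("url", "jdbc:mysql://db-host:3306/testdb")   // placeholder host/database
      .option("dbtable", "demo_table")                     // placeholder table
      .option("user", "root")                              // placeholder credentials
      .option("password", "******")
      .option("driver", "com.mysql.jdbc.Driver")           // driver class shipped in connector 5.1.x
      .load()

    df.show()
    spark.stop()
  }
}

Setting the driver option only helps once the connector jar is actually visible to the JVMs, which is what the spark-submit flags below take care of.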
The complete working command is as follows:
spark-submit \
--jars /opt/sparkJob/mysql-connector-java-5.1.47.jar \
--driver-class-path /opt/sparkJob/mysql-connector-java-5.1.47.jar \
--conf spark.executor.extraClassPath=/opt/sparkJob/mysql-connector-java-5.1.47.jar \
--master yarn \
--deploy-mode cluster \
--driver-memory 4G \
--executor-memory 2G \
--total-executor-cores 12 \
--class com.xxx.xxx.Demo /opt/sparkJob/xxx.jar
All three options must be present: --jars, --driver-class-path, and --conf spark.executor.extraClassPath. --jars ships the MySQL connector to the YARN containers, while the two classpath settings make it visible to java.sql.DriverManager on the driver and on the executors respectively; if any of them is missing, DriverManager cannot match the jdbc:mysql:// URL and reports "No suitable driver".
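If you would rather not repeat these flags on every submission, they map to ordinary Spark configuration properties (--jars to spark.jars, --driver-class-path to spark.driver.extraClassPath), so the same settings can, in principle, go into conf/spark-defaults.conf. This assumes, as above, that the connector jar exists at /opt/sparkJob on every node that may run the driver or an executor in cluster mode:

spark.jars                      /opt/sparkJob/mysql-connector-java-5.1.47.jar
spark.driver.extraClassPath     /opt/sparkJob/mysql-connector-java-5.1.47.jar
spark.executor.extraClassPath   /opt/sparkJob/mysql-connector-java-5.1.47.jar

With those defaults in place, the spark-submit command only needs --master, --deploy-mode, the memory settings and --class. Note also that --total-executor-cores applies to standalone and Mesos clusters; on YARN, executor counts are normally controlled with --num-executors and --executor-cores.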