Submitting a Spark job kept failing with: No suitable driver
The submit command was as follows:
```shell
spark-submit \
  --master yarn-client \
  --num-executors 1 \
  --executor-cores 2 \
  --executor-memory 4G \
  --conf spark.sql.shuffle.partitions=4 \
  --class com.dy.ads.ads_rk_ccxx_population_property ../../script/smart-city-1.0.jar
```
Every run of this command hit the same error. After some research I tried adding --jars /opt/sparkJob/mysql-connector-java-5.1.47.jar, which did not help by itself; further reading showed that --conf spark.executor.extraClassPath=/opt/sparkJob/mysql-connector-java-5.1.47.jar is also required.
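For context on what the error actually means: "No suitable driver" is thrown by the JDK's own java.sql.DriverManager when no registered JDBC driver on the classpath accepts the connection URL, which is why the fix is entirely about getting the MySQL connector jar onto the driver and executor classpaths. A minimal standalone sketch (the class name DriverCheck and the URL are illustrative, not from the original job):

```java
import java.sql.DriverManager;
import java.sql.SQLException;

public class DriverCheck {
    // Try to open a JDBC connection; return the JDK's error message
    // when no registered driver accepts the URL.
    static String tryConnect(String url) {
        try {
            DriverManager.getConnection(url);
            return "connected";
        } catch (SQLException e) {
            return e.getMessage();
        }
    }

    public static void main(String[] args) {
        // With no MySQL connector jar on the classpath, DriverManager finds
        // no Driver that accepts a jdbc:mysql URL and reports "No suitable driver".
        System.out.println(tryConnect("jdbc:mysql://localhost:3306/test"));
    }
}
```

The same lookup happens inside the Spark job's JDBC code, once on the driver JVM and once on each executor JVM, which is why the jar has to reach both.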
The full working command is as follows:
```shell
spark-submit \
  --jars ../../jars/mysql-connector-java-5.1.37.jar \
  --driver-class-path ../../jars/mysql-connector-java-5.1.37.jar \
  --conf spark.executor.extraClassPath=/opt/sparkJob/mysql-connector-java-5.1.37.jar \
  --master yarn-client \
  --num-executors 1 \
  --executor-cores 2 \
  --executor-memory 4G \
  --conf spark.sql.shuffle.partitions=4 \
  --class com.dy.ads.ads_rk_ccxx_population_property ../../script/smart-city-1.0.jar
```
All three options must be present for it to work: --jars, --driver-class-path, and --conf spark.executor.extraClassPath.
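If these flags are needed on every submit, the two classpath settings can instead be made permanent in spark-defaults.conf. A sketch, assuming the connector jar is deployed at /opt/sparkJob on the driver machine and on every worker node (the jar path here is an assumption):

```properties
# --driver-class-path is the CLI equivalent of spark.driver.extraClassPath
spark.driver.extraClassPath   /opt/sparkJob/mysql-connector-java-5.1.37.jar
spark.executor.extraClassPath /opt/sparkJob/mysql-connector-java-5.1.37.jar
```

With these defaults in place, only --jars (or a node-local copy of the jar) still needs to be handled per job.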