Running ./spark-shell against Hive works fine, but running ./spark-sql fails with the following error:
Caused by: java.sql.SQLException: No suitable driver found for jdbc:mysql://hadoop000:3306/hadoop_hive?createDatabaseIfNotExist=true
    at java.sql.DriverManager.getConnection(DriverManager.java:689)
    at java.sql.DriverManager.getConnection(DriverManager.java:208)
    at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
    at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
    ... 66 more
The likely cause is that the launch command did not specify --driver-class-path, so the MySQL JDBC driver is missing from the driver JVM's classpath.
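Before changing the command, it can help to confirm the connector jar actually contains the MySQL driver class. A quick sanity check, assuming the same jar path used in the commands below:

[hadoop@hadoop000 bin]$ jar tf ~/software/mysql-connector-java-5.1.27-bin.jar | grep -i 'Driver.class'

The output should include com/mysql/jdbc/Driver.class; if it does not, the jar itself is the problem rather than the classpath.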
Specifically, ./spark-shell was launched like this:
[hadoop@hadoop000 bin]$ ./spark-shell --master local[2] --jars ~/software/mysql-connector-java-5.1.27-bin.jar
When running ./spark-sql, however, --jars alone is not enough: java.sql.DriverManager only finds drivers that are on the driver JVM's own classpath, so the command should also add --driver-class-path, like this:
[hadoop@hadoop000 bin]$ ./spark-sql --master local[2] --jars ~/software/mysql-connector-java-5.1.27-bin.jar --driver-class-path ~/software/mysql-connector-java-5.1.27-bin.jar
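To avoid typing both flags every time, the same thing can be set once in $SPARK_HOME/conf/spark-defaults.conf. A minimal sketch, assuming the jar lives at /home/hadoop/software/ (the ~ in the commands above; conf files do not expand ~, so spell out the absolute path):

spark.driver.extraClassPath  /home/hadoop/software/mysql-connector-java-5.1.27-bin.jar
spark.jars                   /home/hadoop/software/mysql-connector-java-5.1.27-bin.jar

With that in place, a plain ./spark-sql --master local[2] should start cleanly, and running show databases; at the spark-sql prompt confirms the metastore connection works.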