This problem had me stuck for quite a while:

Exception in thread "main" org.apache.spark.sql.AnalysisException: Table or view not found: emp2; line 1 pos 14
Caused by: org.apache.spark.sql.catalyst.analysis.NoSuchTableException: Table or view 'emp2' not found in database 'default';

Last year a senior classmate set up the environment for me; this time I set everything up myself and ran into a lot of problems. Many blog posts on this error are useless, copied from one another, and solve nothing. The problem, its cause, and the solution are all detailed below.
Environment: Ubuntu 16.04 LTS.
① IDE: IDEA 191.7479.19;
② Hadoop: Hadoop 2.6.0;
③ Spark: Spark-2.4.3-bin-2.6.0-cdh5.9.3;
④ Hive: apache-hive-3.1.1-bin;
⑤ Maven: apache-maven-3.6.1;
⑥ MySQL: MySQL 5.7.26;
⑦ JDBC driver: mysql-connector-java-5.1.28.
From a terminal, the data can be queried through the hive CLI without any problem (Figure 1); but when the same data is accessed programmatically from IDEA, the error above is thrown (Figure 2). A minimal sketch of the kind of code involved follows.
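For context, here is a minimal sketch (an assumption, not the author's original source) of the kind of Spark code that triggers this error. The application name TestHive is taken from the log below, and "line 1 pos 14" in the exception suggests a query like select * from emp2:

```scala
import org.apache.spark.sql.SparkSession

// Minimal sketch, assuming a plain local SparkSession without Hive support.
// "TestHive" matches the application name seen in the log below.
object TestHive {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("TestHive")
      .master("local[*]")
      .getOrCreate()

    // 'emp2' lives in Hive's 'default' database; a session without Hive
    // support only sees Spark's built-in catalog, so this fails with
    // NoSuchTableException.
    spark.sql("select * from emp2").show()

    spark.stop()
  }
}
```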
The full console output from the failing run is shown below:
① show tables:
- log4j:WARN No appenders could be found for logger (org.apache.hadoop.util.Shell).
- log4j:WARN Please initialize the log4j system properly.
- log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
- Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
- 19/07/26 11:03:13 INFO SparkContext: Running Spark version 2.4.3
- 19/07/26 11:03:13 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
- 19/07/26 11:03:14 INFO SparkContext: Submitted application: TestHive
- 19/07/26 11:03:14 INFO SecurityManager: Changing view acls to: hadoop001
- 19/07/26 11:03:14 INFO SecurityManager: Changing modify acls to: hadoop001
- 19/07/26 11:03:14 INFO SecurityManager: Changing view acls groups to:
- 19/07/26 11:03:14 INFO SecurityManager: Changing modify acls groups to:
- 19/07/26 11:03:14 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop001); groups with view permissions: Set(); users with modify permissions: Set(hadoop001); groups with modify permissions: Set()
- 19/07/26 11:03:14 INFO Utils: Successfully started service 'sparkDriver' on port 33193.
- 19/07/26 11:03:14 INFO SparkEnv: Registering MapOutputTracker
- 19/07/26 11:03:14 INFO SparkEnv: Registering BlockManagerMaster
- 19/07/26 11:03:14 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
- 19/07/26 11:03:14 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
- 19/07/26 11:03:14 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-ec9ca4f5-2f81-4bc6-be79-0d46c2dc94da
- 19/07/26 11:03:14 INFO MemoryStore: MemoryStore started with capacity 1940.7 MB
- 19/07/26 11:03:14 INFO SparkEnv: Registering OutputCommitCoordinator
- 19/07/26 11:03:14 INFO Utils: Successfully started service 'SparkUI' on port 4040.
- 19/07/26 11:03:14 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://hadoop001:4040
- 19/07/26 11:03:14 INFO Executor: Starting executor ID driver on host localhost
- 19/07/26 11:03:14 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 45071.
- 19/07/26 11:03:14 INFO NettyBlockTransferService: Server created on hadoop001:45071
- 19/07/26 11:03:14 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
- 19/07/26 11:03:14 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, hadoop001, 45071, None)
- 19/07/26 11:03:14 INFO BlockManagerMasterEndpoint: Registering block manager hadoop001:45071 with 1940.7 MB RAM, BlockManagerId(driver, hadoop001, 45071, None)
- 19/07/26 11:03:14 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, hadoop001, 45071, None)
- 19/07/26 11:03:14 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, hadoop001, 45071, None)
- 19/07/26 11:03:14 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/home/hadoop001/Documents/code/Scala/preprocessingData/spark-warehouse').
- 19/07/26 11:03:14 INFO SharedState: Warehouse path is 'file:/home/hadoop001/Documents/code/Scala/preprocessingData/spark-warehouse'.
- 19/07/26 11:03:15 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
- 19/0
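The decisive clue is the pair of SharedState lines near the end of the log: hive.metastore.warehouse.dir is 'null', and Spark falls back to a local file:/…/spark-warehouse directory. In other words, hive-site.xml was never picked up from the classpath, so the application is talking to Spark's own default catalog rather than the Hive metastore, which is exactly why emp2 in database 'default' cannot be found. A hedged sketch of the usual remedy (an assumption, not necessarily the author's exact fix) is to put hive-site.xml on the classpath and enable Hive support:

```scala
import org.apache.spark.sql.SparkSession

object TestHive {
  def main(args: Array[String]): Unit = {
    // Assumes hive-site.xml (plus core-site.xml/hdfs-site.xml, if needed)
    // has been copied into src/main/resources so it lands on the classpath.
    val spark = SparkSession.builder()
      .appName("TestHive")
      .master("local[*]")
      .enableHiveSupport() // use the Hive metastore instead of the in-memory catalog
      .getOrCreate()

    spark.sql("select * from emp2").show()
    spark.stop()
  }
}
```

Note that enableHiveSupport() also requires the spark-hive module (for Spark 2.4.3, the spark-hive_2.11 artifact) as a Maven dependency; without it, SparkSession refuses to start with Hive support enabled.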
