
A Record of Hive SQL Execution Failures

Overview

Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask

Running SQL against a Hive data source fails with: Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask

Nearly every write-up found online says the same thing: the NameNode is short on memory, i.e., the JVM has too little free heap left for a new job to run. Since Impala SQL has the set mem_limit=4G idiom, it was worth verifying whether Hive accepts session-level set statements in the same way; testing confirmed that it does, as shown below.

Reference: impala_mem_limit

On the Tez engine the corresponding failure looks like this:

Error while compiling statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.tez.TezTask. Vertex failed, is running beyond the 'PHYSICAL' memory limit.

Solution:

set tez.grouping.min-size=1G;
set tez.grouping.max-size=1G;
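
When the SQL is submitted over JDBC, session-level set statements must be issued on the same connection that runs the query. A minimal sketch, assuming a plain hive-jdbc connection (jdbcUrl, user, pass, and failingSql are placeholder names, and the usual java.sql imports are in place):

// Session-level settings only affect statements issued on this same connection.
try (Connection con = DriverManager.getConnection(jdbcUrl, user, pass);
     Statement stat = con.createStatement()) {
    stat.execute("set tez.grouping.min-size=1G");
    stat.execute("set tez.grouping.max-size=1G");
    stat.execute(failingSql); // the query that previously exceeded the memory limit
}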

HiveSQLException: AnalysisException: Table already exists

The full error message:

ERROR c.x.c.a.b.d.b.JdbcDataProvider - jdbc execAutoWork 资金排量缺口 error: {}
org.apache.hive.service.cli.HiveSQLException: AnalysisException: Table already exists: rpt.
	at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:266)
	at org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:252)
	at org.apache.hive.jdbc.HiveStatement.runAsyncOnServer(HiveStatement.java:318)
	at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:259)
	at org.apache.hive.jdbc.HiveStatement.executeUpdate(HiveStatement.java:487)
	at com.alibaba.druid.pool.DruidPooledStatement.executeUpdate(DruidPooledStatement.java:336)
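The table the job tries to CREATE is left over from an earlier run (the log truncates the name after "rpt."). A minimal sketch of the usual workaround, assuming the job may safely rebuild the table on each run; the table name and SELECT below are illustrative placeholders:

// Make the DDL idempotent so a rerun does not trip over the previous run's table.
stat.execute("DROP TABLE IF EXISTS rpt.some_table");
stat.execute("CREATE TABLE rpt.some_table AS SELECT id, amount FROM rpt.source_table");

If the existing data must be kept instead, CREATE TABLE IF NOT EXISTS avoids the error without rebuilding the table.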

Error while compiling statement: FAILED: SemanticException org.apache.hadoop.hive.ql.metadata.InvalidTableException: Table not found
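
This usually means the table name is misspelled or the statement is running against a different database than intended; fully qualifying the name as db.table is the quickest check.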

Error while cleaning up the server resources

TTransportException SocketException Broken pipe

00:30:07.590 [Druid-ConnectionPool-Destroy-1262024801] ERROR com.alibaba.druid.util.JdbcUtils - close connection error
java.sql.SQLException: Error while cleaning up the server resources
	at org.apache.hive.jdbc.HiveConnection.close(HiveConnection.java:640)
	at com.alibaba.druid.util.JdbcUtils.close(JdbcUtils.java:85)
	at com.alibaba.druid.pool.DruidDataSource.shrink(DruidDataSource.java:3194)
	at com.alibaba.druid.pool.DruidDataSource$DestroyTask.run(DruidDataSource.java:2938)
	at com.alibaba.druid.pool.DruidDataSource$DestroyConnectionThread.run(DruidDataSource.java:2922)
Caused by: org.apache.thrift.transport.TTransportException: java.net.SocketException: Broken pipe
	at org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:161)
	at org.apache.thrift.transport.TSaslTransport.flush(TSaslTransport.java:501)
	at org.apache.thrift.transport.TSaslClientTransport.flush(TSaslClientTransport.java:37)
	at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:73)
	at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:62)
	at org.apache.hive.service.cli.thrift.TCLIService$Client.CloseSession(TCLIService.java:165)
	at org.apache.hive.jdbc.HiveConnection.close(HiveConnection.java:638)
Caused by: java.net.SocketException: Broken pipe
	at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:109)
	at java.net.SocketOutputStream.write(SocketOutputStream.java:153)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:159)

This exception occurs when a statement runs a SQL query, no result is returned within the allotted time, a timeout exception is thrown, and the connection is closed immediately afterwards.
One possible fix is to raise the query timeout:

Statement stat = con.createStatement();
// Allow up to 30 minutes before the driver times the query out.
stat.setQueryTimeout(1800);

Note that hive-jdbc 2.x or later is required: on older drivers, setQueryTimeout throws SQLException: Method not supported (see the references at the end).

After this change went to production, however, the problem persisted.

TTransportException: null

With the fix above in place, the production job ran for a while and then hit another error:

com.alibaba.druid.util.JdbcUtils - close connection error
java.sql.SQLException: Error while cleaning up the server resources
	at org.apache.hive.jdbc.HiveConnection.close(HiveConnection.java:730)
	at com.alibaba.druid.util.JdbcUtils.close(JdbcUtils.java:85)
	at com.alibaba.druid.pool.DruidDataSource.shrink(DruidDataSource.java:3194)
	at com.alibaba.druid.pool.DruidDataSource$DestroyTask.run(DruidDataSource.java:2938)
	at com.alibaba.druid.pool.DruidDataSource$DestroyConnectionThread.run(DruidDataSource.java:2922)
Caused by: org.apache.thrift.transport.TTransportException: null
	at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
	at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
	at org.apache.thrift.transport.TSaslTransport.readLength(TSaslTransport.java:376)
	at org.apache.thrift.transport.TSaslTransport.readFrame(TSaslTransport.java:453)
	at org.apache.thrift.transport.TSaslTransport.read(TSaslTransport.java:435)
	at org.apache.thrift.transport.TSaslClientTransport.read(TSaslClientTransport.java:37)
	at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
	at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429)
	at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318)
	at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
	at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
	at org.apache.hive.jdbc.HiveConnection.close(HiveConnection.java:728)
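
A TTransportException with a null message is thrown on end-of-stream, i.e., the remote side has already closed the socket: by the time Druid's destroy thread closes the pooled connection, HiveServer2 has dropped the session, and the CloseSession call finds a dead transport. One mitigation worth trying is to have the pool validate and evict idle connections before the server-side idle timeout fires; a sketch using standard Druid pool settings (the threshold values are illustrative assumptions and should sit below HiveServer2's idle-session timeout):

DruidDataSource ds = new DruidDataSource();
ds.setValidationQuery("select 1");
ds.setTestWhileIdle(true);                   // validate idle connections before handing them out
ds.setTimeBetweenEvictionRunsMillis(60_000); // run the evictor once a minute
ds.setMinEvictableIdleTimeMillis(300_000);   // evict connections idle for 5+ minutes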

SQLException: Invalid columnIndex: 2

The error message:

java.sql.SQLException: Invalid columnIndex: 2
	at org.apache.hive.jdbc.HiveBaseResultSet.getColumnValue(HiveBaseResultSet.java:418)
	at org.apache.hive.jdbc.HiveBaseResultSet.getObject(HiveBaseResultSet.java:463)
	at com.alibaba.druid.pool.DruidPooledResultSet.getObject(DruidPooledResultSet.java:438)
The code that triggered it loops over columns using a size value computed elsewhere, which can disagree with the column count of the current result set:
while (rs.next()) {
    for (int i = 1; i <= size; i++) {
        resultList.add(rs.getObject(i));
    }
}
Reading the column count from the result set's own metadata fixes it:
int columnCount = rs.getMetaData().getColumnCount();
while (rs.next()) {
    for (int i = 1; i <= columnCount; i++) {
        resultList.add(rs.getObject(i));
    }
}
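Hoisting getColumnCount() out of the loop, as above, also avoids a metadata lookup on every row.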

The query did not generate a result set!

The error message:

java.sql.SQLException: The query did not generate a result set!
	at org.apache.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:376)
	at com.alibaba.druid.pool.DruidPooledStatement.executeQuery(DruidPooledStatement.java:308)

The failing line:

rs = ps.executeQuery(item);

Analysis: executeQuery() is only valid for statements that return a result set, i.e., SELECT queries. DDL and update statements produce no result set, so they must be run through execute() or executeUpdate() instead.
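
When the statement type is not known up front (apparently the case here, given that an arbitrary item string is executed), a sketch of the usual pattern is to call execute() and fetch a result set only if one was produced:

// execute() works for any statement type; it returns true only when
// the statement actually produced a ResultSet.
boolean hasResultSet = ps.execute(item);
if (hasResultSet) {
    try (ResultSet rs = ps.getResultSet()) {
        while (rs.next()) {
            // ... consume the row ...
        }
    }
}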

No such file or directory

The error message:

java.sql.SQLException: Disk I/O error: Failed to open HDFS file hdfs://ppdhdpha/user/hive/warehouse/cszc.db/cs_phone_multi_call_pool_single/904924957a18ee1f-170b897c00000005_129981828_data.0.parq
Error(2): No such file or directory
Root cause: RemoteException: File does not exist: /user/hive/warehouse/cszc.db/cs_phone_multi_call_pool_single/904924957a18ee1f-170b897c00000005_129981828_data.0.parq
	at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:85)
	at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:75)
	at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getBlockLocations(FSDirStatAndListingOp.java:152)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1909)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:735)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:415)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:523)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:869)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:815)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2675)
	at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:279)
	at org.apache.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:375)
	at com.alibaba.druid.pool.DruidPooledStatement.executeQuery(DruidPooledStatement.java:308)

Disk I/O error: Failed to open HDFS file
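
This is the same failure as above: the file recorded in cached metadata no longer exists on HDFS, typically because the underlying data was rewritten (for example by an INSERT OVERWRITE or a compaction) after the metadata was cached. The "Disk I/O error" prefix suggests the query was served by Impala; if so, refreshing the table's metadata is the usual remedy (a hedged one-liner, reusing the table name from the log above):

stat.execute("REFRESH cszc.cs_phone_multi_call_pool_single");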

References

Error while cleaning up the server resources
hive-query-cant-generate-result-set-via-jdbc
query-did-not-generate-a-resultset
Hive SQLException: Method not supported
