
SparkSQL exception notes: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe


1. Caused by: org.apache.hadoop.hive.serde2.SerDeException: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe: columns has 4 elements while columns.types has 3 elements!

SQL:

insert overwrite directory 'xxx'
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001'
select
  id,
  contract_id,
  date_format(update_time, 'yyyy-MM-dd hh:mm:ss')
from (

Fix:

An expression column such as date_format(update_time, 'yyyy-MM-dd hh:mm:ss') needs an explicit alias:

insert overwrite directory 'xxx'
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001'
select
  id,
  contract_id,
  date_format(update_time, 'yyyy-MM-dd hh:mm:ss') as update_time
from (

After this change the exception goes away.

--------------------------------------------------------------------------------------------------------------------------------

spark-version: 2.3 — the problem occurs

Source code of the failing path:

public void extractColumnInfo() throws SerDeException {
  String columnNameProperty = this.tableProperties.getProperty("columns");
  String columnTypeProperty = this.tableProperties.getProperty("columns.types");
  if (columnNameProperty != null && columnNameProperty.length() > 0) {
    // The name list is split on ",", so if an expression contains "," and has no alias,
    // the resulting list is larger than the real column count.
    this.columnNames = Arrays.asList(columnNameProperty.split(","));
  } else {
    this.columnNames = new ArrayList();
  }
  if (columnTypeProperty == null) {
    StringBuilder sb = new StringBuilder();
    for (int i = 0; i < this.columnNames.size(); ++i) {
      if (i > 0) {
        sb.append(":");
      }
      sb.append("string");
    }
    columnTypeProperty = sb.toString();
  }
  this.columnTypes = TypeInfoUtils.getTypeInfosFromTypeString(columnTypeProperty);
  if (this.columnNames.size() != this.columnTypes.size()) {
    // The exception is thrown here.
    throw new SerDeException(this.serdeName + ": columns has " + this.columnNames.size() + " elements while columns.types has " + this.columnTypes.size() + " elements!");
  }
}
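
For illustration, here is a minimal standalone Java sketch (not the Hive class itself; the column/type property values are made up to mirror the query above, and the type parsing is a simplified stand-in for TypeInfoUtils) reproducing the size mismatch caused by splitting on "," and showing how an alias avoids it:

import java.util.Arrays;
import java.util.List;

public class CommaSplitDemo {

    // Simplified stand-in for the LazySimpleSerDe size check: names are split on ",",
    // types on ":" (the real code uses TypeInfoUtils.getTypeInfosFromTypeString).
    static void check(String columns, String columnTypes) {
        List<String> names = Arrays.asList(columns.split(","));
        List<String> types = Arrays.asList(columnTypes.split(":"));
        System.out.println("columns has " + names.size()
                + " elements while columns.types has " + types.size() + " elements");
    }

    public static void main(String[] args) {
        String types = "string:string:string"; // three column types

        // No alias: the "," inside date_format() is also treated as a separator -> 4 names vs 3 types.
        check("id,contract_id,date_format(update_time,'yyyy-MM-dd hh:mm:ss')", types);

        // With an alias the stored column name is just the alias -> 3 names vs 3 types.
        check("id,contract_id,update_time", types);
    }
}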

To further confirm that the "," split is the cause, adjust the SQL as follows:

insert overwrite directory 'xxx'
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001'
select
  id,
  contract_id,
  -- two commas in this expression
  substr(update_time, 1, 10)
from (

The result is the error: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe: columns has 5 elements while columns.types has 3 elements!

This confirms the cause: with an alias, the columnName becomes the alias, so the split no longer produces extra elements.

spark-version: 3.1 — the problem does not occur

/**
 * Extracts and set column names and column types from the table properties
 * @throws SerDeException
 */
public void extractColumnInfo() throws SerDeException {
  // Read the configuration parameters
  String columnNameProperty = tableProperties.getProperty(serdeConstants.LIST_COLUMNS);
  // NOTE: if "columns.types" is missing, all columns will be of String type
  String columnTypeProperty = tableProperties.getProperty(serdeConstants.LIST_COLUMN_TYPES);

  // Parse the configuration parameters
  String columnNameDelimiter = tableProperties.containsKey(serdeConstants.COLUMN_NAME_DELIMITER) ? tableProperties
      .getProperty(serdeConstants.COLUMN_NAME_DELIMITER) : String.valueOf(SerDeUtils.COMMA);
  if (columnNameProperty != null && columnNameProperty.length() > 0) {
    columnNames = Arrays.asList(columnNameProperty.split(columnNameDelimiter));
  } else {
    columnNames = new ArrayList<String>();
  }
  if (columnTypeProperty == null) {
    // Default type: all string
    StringBuilder sb = new StringBuilder();
    for (int i = 0; i < columnNames.size(); i++) {
      if (i > 0) {
        sb.append(":");
      }
      sb.append(serdeConstants.STRING_TYPE_NAME);
    }
    columnTypeProperty = sb.toString();
  }
  columnTypes = TypeInfoUtils.getTypeInfosFromTypeString(columnTypeProperty);
  if (columnNames.size() != columnTypes.size()) {
    throw new SerDeException(serdeName + ": columns has " + columnNames.size()
        + " elements while columns.types has " + columnTypes.size() + " elements!");
  }
}
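
The visible difference from the 2.3 code is that the name list is split on a configurable delimiter (serdeConstants.COLUMN_NAME_DELIMITER, which as far as I know maps to the table property column.name.delimiter) instead of always on ",". Below is a minimal standalone sketch of just that delimiter handling, with hypothetical property values; whether Spark 3.1 actually writes such a delimiter for this query, or simply avoids commas in the generated column names, is not shown by the code above:

import java.util.Arrays;
import java.util.List;
import java.util.Properties;

public class DelimiterAwareSplitDemo {

    // Mirrors the delimiter handling in the newer extractColumnInfo(): use the
    // delimiter from the table properties if present, otherwise fall back to ",".
    static List<String> splitColumnNames(Properties tableProperties) {
        String columns = tableProperties.getProperty("columns");
        String delimiter = tableProperties.containsKey("column.name.delimiter")
                ? tableProperties.getProperty("column.name.delimiter")
                : ",";
        return Arrays.asList(columns.split(delimiter));
    }

    public static void main(String[] args) {
        // Hypothetical property values: a NUL-separated name list plus the matching
        // delimiter entry; not necessarily what any particular Spark/Hive version writes.
        Properties props = new Properties();
        props.setProperty("columns",
                "id\0contract_id\0date_format(update_time,'yyyy-MM-dd hh:mm:ss')");
        props.setProperty("column.name.delimiter", "\0");

        List<String> names = splitColumnNames(props);
        System.out.println("columns has " + names.size() + " elements"); // 3: the commas no longer over-split
    }
}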
