
Integrating HBase with Hive


HBase version: 0.96.2
Hive version: 0.13
For the integration you need to copy the HBase jars (the ones whose names start with hbase-) into Hive's lib directory. Pay particular attention to htrace-core-2.04.jar: without it, creating the table fails with the following error.

Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
        at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:387)
        ... 34 more
Caused by: java.lang.NoClassDefFoundError: org/cloudera/htrace/Trace
        at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:195)
        at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:479)
        at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65)
        at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:806)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:633)
        ... 39 more
Caused by: java.lang.ClassNotFoundException: org.cloudera.htrace.Trace
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
        ... 45 more
)
        at org.apache.hadoop.hive.hbase.HBaseStorageHandler.getHBaseAdmin(HBaseStorageHandler.java:88)
        at org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:162)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:554)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:547)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
        at $Proxy11.createTable(Unknown Source)
        at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:613)
        at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4194)
        at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:281)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1472)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1239)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1057)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:880)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:870)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:792)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
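
A straightforward fix is to copy the required jars from the HBase installation's lib directory into Hive's lib directory and restart the Hive CLI. The sketch below assumes HBase lives under /usr/local/hbase-0.96.2-hadoop2 (an assumed path; the Hive path matches the configuration further down) and that htrace-core-2.04.jar ships in HBase's lib:

# Assumed install locations -- adjust to your environment.
HBASE_HOME=/usr/local/hbase-0.96.2-hadoop2
HIVE_HOME=/usr/local/apache-hive-0.13.0-bin

# Copy the hbase-* client jars plus the htrace dependency into Hive's lib.
cp $HBASE_HOME/lib/hbase-*.jar $HIVE_HOME/lib/
cp $HBASE_HOME/lib/htrace-core-2.04.jar $HIVE_HOME/lib/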

Configuration

Add the following property to hive-site.xml:

<property>
  <name>hive.aux.jars.path</name>
  <value>file:///usr/local/apache-hive-0.13.0-bin/lib/hive-hbase-handler-0.13.0.jar,file:///usr/local/apache-hive-0.13.0-bin/lib/hbase-client-0.96.2-hadoop2.jar,file:///usr/local/apache-hive-0.13.0-bin/lib/hbase-common-0.96.2-hadoop2.jar,file:///usr/local/apache-hive-0.13.0-bin/lib/htrace-core-2.04.jar,file:///usr/local/apache-hive-0.13.0-bin/lib/guava-11.0.2.jar,file:///usr/local/apache-hive-0.13.0-bin/lib/zookeeper-3.4.5.jar</value>
</property>
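
Alternatively (a sketch, not from the original post), the same jar list can be supplied for a single session with the Hive CLI's --auxpath option instead of editing hive-site.xml:

# Start the Hive CLI with the auxiliary jars for this session only.
hive --auxpath /usr/local/apache-hive-0.13.0-bin/lib/hive-hbase-handler-0.13.0.jar,/usr/local/apache-hive-0.13.0-bin/lib/hbase-client-0.96.2-hadoop2.jar,/usr/local/apache-hive-0.13.0-bin/lib/hbase-common-0.96.2-hadoop2.jar,/usr/local/apache-hive-0.13.0-bin/lib/htrace-core-2.04.jar,/usr/local/apache-hive-0.13.0-bin/lib/guava-11.0.2.jar,/usr/local/apache-hive-0.13.0-bin/lib/zookeeper-3.4.5.jar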

Creating the table in Hive

The key column maps to the HBase row key, col1 is the column family, and value is the column qualifier.

hive> create table hb_test(key int, value string)
    > stored by "org.apache.hadoop.hive.hbase.HBaseStorageHandler"
    > with serdeproperties("hbase.columns.mapping"=":key,col1:value")
    > tblproperties("hbase.table.name"="hb_test");
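
Because this is a managed table backed by the HBaseStorageHandler, Hive also creates the underlying HBase table hb_test (named via TBLPROPERTIES hbase.table.name). A quick sanity check from the HBase shell, output omitted:

list                  # 'hb_test' should appear in the table list
describe 'hb_test'    # should show the single column family 'col1'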

Inserting data from the HBase shell

hbase(main):066:0> put "hb_test", 2, "col1:value", "wy"
0 row(s) in 0.0090 seconds

hbase(main):067:0> put "hb_test", 3, "col1:value", "w"
0 row(s) in 0.0080 seconds

hbase(main):068:0> put "hb_test", 4, "col1:value", "x"
0 row(s) in 0.0080 seconds

hbase(main):069:0> put "hb_test", 5, "col1:value", "y"
0 row(s) in 0.0070 seconds

hbase(main):070:0> put "hb_test", 11, "col1:value", "ww"
0 row(s) in 0.0080 seconds

hbase(main):071:0> put "hb_test", 22, "col1:value", "yy"
0 row(s) in 0.0100 seconds

hbase(main):072:0> put "hb_test", 33, "col1:value", "x"
0 row(s) in 0.0080 seconds

hbase(main):073:0> put "hb_test", 44, "col1:value", "xx"
0 row(s) in 0.0090 seconds

hbase(main):074:0> put "hb_test", 55, "col1:value", "xx"
0 row(s) in 0.0120 seconds
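
Before going back to Hive, the rows can also be checked from the HBase side (a sketch, output omitted):

scan 'hb_test'     # dump all rows and their col1:value cells
count 'hb_test'    # should report 9 rows for the puts above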

Querying from Hive

hive> select * from hb_test where value = 'x';
Total jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_1482150790771_0001, Tracking URL = http://hadoopwy2:8088/proxy/application_14821507
Kill Command = /usr/local/hadoop2/bin/hadoop job  -kill job_1482150790771_0001
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0
2016-12-19 23:19:07,512 Stage-1 map = 0%,  reduce = 0%
2016-12-19 23:20:07,798 Stage-1 map = 0%,  reduce = 0%
2016-12-19 23:20:39,077 Stage-1 map = 100%,  reduce = 0%, Cumulative CPU 2.96 sec
MapReduce Total cumulative CPU time: 2 seconds 960 msec
Ended Job = job_1482150790771_0001
MapReduce Jobs Launched:
Job 0: Map: 1   Cumulative CPU: 2.96 sec   HDFS Read: 261 HDFS Write: 9 SUCCESS
Total MapReduce CPU Time Spent: 2 seconds 960 msec
OK
33      x
4       x
Time taken: 205.946 seconds, Fetched: 2 row(s)
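
Aggregations work the same way and also run as MapReduce jobs over the HBase-backed table. For example, a per-value count (a sketch, output omitted):

hive> select value, count(*) from hb_test group by value;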

Please credit the original source when reposting: http://blog.csdn.net/wangyang1354/article/details/53763627
