
Version Compatibility Among Hadoop Cluster Components (Spark, Hive, HBase, Flink)

HBase and Hadoop version compatibility

The support matrix is maintained in the official HBase reference guide: http://hbase.apache.org/book.html#hadoop
[Figure: screenshot of the Hadoop support matrix from the HBase reference guide]

Hive/Hadoop and Hive/Spark version compatibility

The version information below comes from the pom.xml in each Hive source tarball (src.tar.gz):

hive-3.1.2
<hadoop.version>3.1.0</hadoop.version>
<hbase.version>2.0.0-alpha4</hbase.version>
<spark.version>2.3.0</spark.version>
<scala.binary.version>2.11</scala.binary.version>
<scala.version>2.11.8</scala.version>
<zookeeper.version>3.4.6</zookeeper.version>

hive-2.3.6
<hadoop.version>2.7.2</hadoop.version>
<hbase.version>1.1.1</hbase.version>
<spark.version>2.0.0</spark.version>
<scala.binary.version>2.11</scala.binary.version>
<scala.version>2.11.8</scala.version>
<zookeeper.version>3.4.6</zookeeper.version>
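The pinned versions above can be read out of a pom.xml programmatically. A minimal sketch using Python's standard-library XML parser, run here against an inline fragment modeled on hive-3.1.2's properties block (in practice you would pass the real pom.xml path to `ET.parse()`):

```python
import xml.etree.ElementTree as ET

# Inline fragment modeled on the <properties> block of hive-3.1.2's pom.xml.
POM_FRAGMENT = """<project xmlns="http://maven.apache.org/POM/4.0.0">
  <properties>
    <hadoop.version>3.1.0</hadoop.version>
    <hbase.version>2.0.0-alpha4</hbase.version>
    <spark.version>2.3.0</spark.version>
    <scala.binary.version>2.11</scala.binary.version>
  </properties>
</project>"""

# Maven poms live in this XML namespace, so lookups need a prefix mapping.
NS = {"m": "http://maven.apache.org/POM/4.0.0"}

def pinned_versions(pom_xml: str) -> dict:
    """Return {property name: version} from a pom's <properties> block."""
    root = ET.fromstring(pom_xml)
    props = root.find("m:properties", NS)
    # Strip the "{namespace}" prefix ElementTree puts on each tag.
    return {child.tag.split("}", 1)[1]: child.text for child in props}

versions = pinned_versions(POM_FRAGMENT)
print(versions["hadoop.version"])  # 3.1.0
print(versions["spark.version"])   # 2.3.0
```

This only reads the `<properties>` block; versions referenced indirectly elsewhere in the pom would need fuller Maven tooling (e.g. `mvn help:evaluate`).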

Hive Version    Spark Version
3.0.x           2.3.0
2.3.x           2.0.0
2.2.x           1.6.0
2.1.x           1.6.0
2.0.x           1.5.0
1.2.x           1.3.1
1.1.x           1.2.0

The <spark.version> pinned in each Hive source tree:

apache-hive-1.2.2-src <spark.version>1.3.1</spark.version>
apache-hive-2.1.1-src <spark.version>1.6.0</spark.version>
apache-hive-2.3.3-src <spark.version>2.0.0</spark.version>
apache-hive-3.0.0-src <spark.version>2.3.0</spark.version>
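For scripting an install, the Hive-to-Spark pairing above can be encoded as a small lookup table. A sketch (the mapping data is taken directly from this article's table; the function name is illustrative):

```python
# Hive major.minor -> Spark version pinned in that Hive line's pom.xml,
# per the compatibility table above.
HIVE_TO_SPARK = {
    "3.0": "2.3.0",
    "2.3": "2.0.0",
    "2.2": "1.6.0",
    "2.1": "1.6.0",
    "2.0": "1.5.0",
    "1.2": "1.3.1",
    "1.1": "1.2.0",
}

def expected_spark(hive_version: str) -> str:
    """Map a full Hive version (e.g. '2.3.6') to the Spark version its pom pins."""
    major_minor = ".".join(hive_version.split(".")[:2])
    return HIVE_TO_SPARK[major_minor]

print(expected_spark("2.3.6"))  # 2.0.0
print(expected_spark("3.0.0"))  # 2.3.0
```

A pre-install check could compare `expected_spark(...)` against the Spark version actually being deployed and warn on mismatch.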

A combination reported to work on Stack Overflow:

spark 2.0.2 with hadoop 2.7.3 and hive 2.1
Reference: https://stackoverflow.com/questions/42281174/hive-2-1-1-on-spark-which-version-of-spark-should-i-use

A combination reported by a user in a QQ group:

Hive 2.6 with Spark 2.2.0 (as reported; note that upstream Apache Hive has no 2.6 release, so this likely refers to a vendor distribution)

The following combination has shown no compatibility problems so far:

apache-hive-3.0.0-bin
hadoop-3.0.3
spark-2.3.1-bin-hadoop2.7
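Note that the Spark binary name in the combination above encodes both the Spark version and the Hadoop line it was built against, following the naming convention of Apache Spark's download packages. A sketch that parses both fields out of such a name:

```python
import re

# Apache Spark binary distributions are named spark-<version>-bin-hadoop<line>,
# e.g. spark-2.3.1-bin-hadoop2.7 is Spark 2.3.1 built against Hadoop 2.7.
DIST_RE = re.compile(r"spark-(\d+\.\d+\.\d+)-bin-hadoop(\d+\.\d+)")

def parse_spark_dist(name: str):
    """Return (spark_version, hadoop_line) parsed from a Spark dist name."""
    m = DIST_RE.fullmatch(name)
    if m is None:
        raise ValueError(f"not a recognized Spark distribution name: {name}")
    return m.group(1), m.group(2)

print(parse_spark_dist("spark-2.3.1-bin-hadoop2.7"))  # ('2.3.1', '2.7')
```

In the working combination above, the Spark build targets the Hadoop 2.7 client libraries even though the cluster runs Hadoop 3.0.3; parsing the name makes that mismatch visible so it can be checked deliberately.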

Reference blog post:
https://blog.csdn.net/appleyuchi/article/details/81171785

Flink

The version information comes from the pom.xml of the Flink source tarball:

flink-1.9.1
<hadoop.version>2.4.1</hadoop.version>
<scala.version>2.11.12</scala.version>
<scala.binary.version>2.11</scala.binary.version>
<zookeeper.version>3.4.10</zookeeper.version>
<hive.version>2.3.4</hive.version>

Reference:
https://blog.csdn.net/lucklydog123/article/details/109773865
