
Compiling Flink 1.14.4 against Hadoop 3.0.0-cdh6.3.2 (installing Flink 1.14 on CDH)


I. Background

  • Symptom

When using the Hive dialect of Flink SQL (Flink 1.14.4) on CDH 6.3.2, the following exception appears:

    java.lang.RuntimeException: java.lang.IllegalArgumentException: Unrecognized Hadoop major version number: 3.0.0-cdh6.2.1

  • Explanation

The community (open-source) Hive 2.x releases do not support Hadoop 3.x in this situation. The Hive shipped with CDH, however, is not the same as the community build: hive 2.1.1-cdh6.3.2 does support Hadoop 3.x, which is why recompiling Flink against the CDH artifacts resolves the problem.
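Before compiling, it is worth double-checking which Hadoop and Hive builds the cluster actually runs, since those are the versions the build has to target. A minimal check from any CDH node (standard CLI commands; the exact output depends on your parcel) might look like this:

    # Hadoop build used by the cluster (expect something like 3.0.0-cdh6.3.2)
    hadoop version

    # Hive build shipped with the parcel (expect something like 2.1.1-cdh6.3.2)
    hive --version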

II. Compiling Flink 1.14.4 against 3.0.0-cdh6.3.2

1. Environment and software

  1. CentOS 7.9 (not strictly required, but building on Linux is recommended; it saves you a lot of pitfalls)
  2. maven-3.9.2
  3. flink-1.14.4
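For reference, a rough sketch of fetching and unpacking both tools; the Apache archive URLs and the /root working directory are my assumptions, so adjust mirrors, versions and paths to your environment:

    cd /root

    # Maven 3.9.2 binary distribution
    wget https://archive.apache.org/dist/maven/maven-3/3.9.2/binaries/apache-maven-3.9.2-bin.tar.gz
    tar -zxf apache-maven-3.9.2-bin.tar.gz && mv apache-maven-3.9.2 maven-3.9.2
    export PATH=/root/maven-3.9.2/bin:$PATH
    mvn -version    # should report Apache Maven 3.9.2

    # Flink 1.14.4 source release (extracts to /root/flink-1.14.4)
    wget https://archive.apache.org/dist/flink/flink-1.14.4/flink-1.14.4-src.tgz
    tar -zxf flink-1.14.4-src.tgz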

2. Compiling Flink 1.14.4

  1. maven-3.9.2

    vim maven-3.9.2/conf/settings.xml

    <settings xmlns="http://maven.apache.org/SETTINGS/1.2.0"
              xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
              xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.2.0 https://maven.apache.org/xsd/settings-1.2.0.xsd">
        <localRepository>/root/maven-3.9.2/repo</localRepository>
        ......
        <mirrors>
            <mirror>
                <id>maven-default-http-blocker</id>
                <mirrorOf>external:http:*</mirrorOf>
                <name>Pseudo repository to mirror external repositories initially using HTTP.</name>
                <url>http://0.0.0.0/</url>
                <blocked>true</blocked>
            </mirror>
            <mirror>
                <id>conjars-https</id>
                <url>https://conjars.wensel.net/repo/</url>
                <mirrorOf>conjars</mirrorOf>
            </mirror>
            <mirror>
                <id>conjars</id>
                <name>conjars</name>
                <url>https://conjars.wensel.net/repo/</url>
                <mirrorOf>conjarse</mirrorOf>
            </mirror>
        </mirrors>
        ......

  2. flink-1.14.4: download and extract (see the example commands above)

vim /root/flink-1.14.4/pom.xml

Add the CDH repository:

        <repositories>
                <repository>
                        <id>cloudera</id>
                        <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
                </repository>
        </repositories>
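Optionally, before starting a multi-hour build, you can probe whether the Cloudera repository is reachable and serves the CDH artifacts. This is only a sanity check I like to run, and the artifact coordinates are just an example:

        mvn dependency:get \
            -DremoteRepositories=https://repository.cloudera.com/artifactory/cloudera-repos/ \
            -Dartifact=org.apache.hadoop:hadoop-client:3.0.0-cdh6.3.2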

Change the Hadoop version to the one matching CDH:

<hadoop.version>3.0.0-cdh6.3.2</hadoop.version>

vim /root/flink-1.14.4/flink-connectors/flink-sql-connector-hive-2.2.0/pom.xml

In flink-sql-connector-hive-2.2.0, change the hive-exec version to the CDH Hive version, 2.1.1-cdh6.3.2.
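To find the entry quickly before editing it by hand, a simple grep works (the pattern is just a convenience, not part of the official build):

        grep -n -A 3 "hive-exec" /root/flink-1.14.4/flink-connectors/flink-sql-connector-hive-2.2.0/pom.xml
        # then set the hive-exec <version> to 2.1.1-cdh6.3.2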

Start the build:

mvn clean install -DskipTests -Dfast -Dhadoop.version=3.0.0-cdh6.3.2
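The full build took my machine well over two hours (see the summary below). If you have spare cores, Maven's parallel mode can shorten it; this variant is only a suggestion, and the thread count is an example:

        mvn clean install -DskipTests -Dfast -T 1C -Dhadoop.version=3.0.0-cdh6.3.2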

Build output:

    [INFO] --- install:2.5.2:install (default-install) @ java-ci-tools ---
    [INFO] Installing /root/flink-1.14.4/tools/ci/java-ci-tools/target/java-ci-tools-1.14.4.jar to /root/.m2/repository/org/apache/flink/java-ci-tools/1.14.4/java-ci-tools-1.14.4.jar
    [INFO] Installing /root/flink-1.14.4/tools/ci/java-ci-tools/target/dependency-reduced-pom.xml to /root/.m2/repository/org/apache/flink/java-ci-tools/1.14.4/java-ci-tools-1.14.4.pom
    [INFO] Installing /root/flink-1.14.4/tools/ci/java-ci-tools/target/java-ci-tools-1.14.4-tests.jar to /root/.m2/repository/org/apache/flink/java-ci-tools/1.14.4/java-ci-tools-1.14.4-tests.jar
    [INFO] ------------------------------------------------------------------------
    [INFO] Reactor Summary for Flink : 1.14.4:
    [INFO]
    [INFO] Flink : ............................................ SUCCESS [ 59.471 s]
    [INFO] Flink : Annotations ................................ SUCCESS [ 16.499 s]
    [INFO] Flink : Test utils : ............................... SUCCESS [ 0.048 s]
    [INFO] Flink : Test utils : Junit ......................... SUCCESS [ 0.756 s]
    [INFO] Flink : Metrics : .................................. SUCCESS [ 0.051 s]
    [INFO] Flink : Metrics : Core ............................. SUCCESS [ 32.274 s]
    [INFO] Flink : Core ....................................... SUCCESS [01:24 min]
    [INFO] Flink : Java ....................................... SUCCESS [ 19.666 s]
    [INFO] Flink : Scala ...................................... SUCCESS [07:07 min]
    [INFO] Flink : FileSystems : .............................. SUCCESS [ 0.035 s]
    [INFO] Flink : FileSystems : Hadoop FS .................... SUCCESS [03:25 min]
    [INFO] Flink : FileSystems : Mapr FS ...................... SUCCESS [ 0.701 s]
    [INFO] Flink : FileSystems : Hadoop FS shaded ............. SUCCESS [ 56.192 s]
    [INFO] Flink : FileSystems : S3 FS Base ................... SUCCESS [ 22.125 s]
    [INFO] Flink : FileSystems : S3 FS Hadoop ................. SUCCESS [ 42.749 s]
    [INFO] Flink : FileSystems : S3 FS Presto ................. SUCCESS [03:29 min]
    [INFO] Flink : FileSystems : OSS FS ....................... SUCCESS [ 18.792 s]
    [INFO] Flink : FileSystems : Azure FS Hadoop .............. SUCCESS [03:40 min]
    [INFO] Flink : RPC : ...................................... SUCCESS [ 0.032 s]
    [INFO] Flink : RPC : Core ................................. SUCCESS [ 0.213 s]
    [INFO] Flink : RPC : Akka ................................. SUCCESS [02:45 min]
    [INFO] Flink : RPC : Akka-Loader .......................... SUCCESS [ 2.896 s]
    [INFO] Flink : Queryable state : .......................... SUCCESS [ 0.028 s]
    [INFO] Flink : Queryable state : Client Java .............. SUCCESS [ 0.287 s]
    [INFO] Flink : Runtime .................................... SUCCESS [02:21 min]
    [INFO] Flink : Optimizer .................................. SUCCESS [ 1.279 s]
    [INFO] Flink : Connectors : ............................... SUCCESS [ 0.047 s]
    [INFO] Flink : Connectors : File Sink Common .............. SUCCESS [ 0.198 s]
    [INFO] Flink : Streaming Java ............................. SUCCESS [ 7.293 s]
    [INFO] Flink : Clients .................................... SUCCESS [ 17.004 s]
    [INFO] Flink : DSTL ....................................... SUCCESS [ 0.029 s]
    [INFO] Flink : DSTL : DFS ................................. SUCCESS [ 0.352 s]
    [INFO] Flink : State backends : ........................... SUCCESS [ 0.028 s]
    [INFO] Flink : State backends : RocksDB ................... SUCCESS [04:48 min]
    [INFO] Flink : State backends : Changelog ................. SUCCESS [ 0.409 s]
    [INFO] Flink : Test utils : Utils ......................... SUCCESS [ 36.604 s]
    [INFO] Flink : Runtime web ................................ SUCCESS [03:45 min]
    [INFO] Flink : Test utils : Connectors .................... SUCCESS [ 0.129 s]
    [INFO] Flink : Connectors : Base .......................... SUCCESS [ 0.422 s]
    [INFO] Flink : Connectors : Files ......................... SUCCESS [ 0.668 s]
    [INFO] Flink : Examples : ................................. SUCCESS [ 0.060 s]
    [INFO] Flink : Examples : Batch ........................... SUCCESS [ 17.426 s]
    [INFO] Flink : Connectors : Hadoop compatibility .......... SUCCESS [ 4.132 s]
    [INFO] Flink : Tests ...................................... SUCCESS [ 41.403 s]
    [INFO] Flink : Streaming Scala ............................ SUCCESS [ 22.269 s]
    [INFO] Flink : Connectors : HCatalog ...................... SUCCESS [01:46 min]
    [INFO] Flink : Table : .................................... SUCCESS [ 0.062 s]
    [INFO] Flink : Table : Common ............................. SUCCESS [01:41 min]
    [INFO] Flink : Table : API Java ........................... SUCCESS [ 1.249 s]
    [INFO] Flink : Table : API Java bridge .................... SUCCESS [ 0.533 s]
    [INFO] Flink : Formats : .................................. SUCCESS [ 0.029 s]
    [INFO] Flink : Format : Common ............................ SUCCESS [ 0.070 s]
    [INFO] Flink : Table : API Scala .......................... SUCCESS [ 7.713 s]
    [INFO] Flink : Table : API Scala bridge ................... SUCCESS [ 7.706 s]
    [INFO] Flink : Table : SQL Parser ......................... SUCCESS [01:50 min]
    [INFO] Flink : Table : SQL Parser Hive .................... SUCCESS [ 12.423 s]
    [INFO] Flink : Table : Code Splitter ...................... SUCCESS [01:24 min]
    [INFO] Flink : Libraries : ................................ SUCCESS [ 0.034 s]
    [INFO] Flink : Libraries : CEP ............................ SUCCESS [ 1.634 s]
    [INFO] Flink : Table : Runtime ............................ SUCCESS [ 3.295 s]
    [INFO] Flink : Table : Planner ............................ SUCCESS [02:05 min]
    [INFO] Flink : Formats : Json ............................. SUCCESS [ 0.635 s]
    [INFO] Flink : Connectors : Elasticsearch base ............ SUCCESS [01:47 min]
    [INFO] Flink : Connectors : Elasticsearch 5 ............... SUCCESS [ 34.571 s]
    [INFO] Flink : Connectors : Elasticsearch 6 ............... SUCCESS [01:05 min]
    [INFO] Flink : Connectors : Elasticsearch 7 ............... SUCCESS [01:32 min]
    [INFO] Flink : Connectors : HBase base .................... SUCCESS [01:05 min]
    [INFO] Flink : Connectors : HBase 1.4 ..................... SUCCESS [02:10 min]
    [INFO] Flink : Connectors : HBase 2.2 ..................... SUCCESS [01:40 min]
    [INFO] Flink : Formats : Hadoop bulk ...................... SUCCESS [ 0.972 s]
    [INFO] Flink : Formats : Orc .............................. SUCCESS [ 9.371 s]
    [INFO] Flink : Formats : Orc nohive ....................... SUCCESS [ 9.061 s]
    [INFO] Flink : Formats : Avro ............................. SUCCESS [ 22.277 s]
    [INFO] Flink : Formats : Parquet .......................... SUCCESS [ 46.070 s]
    [INFO] Flink : Formats : Csv .............................. SUCCESS [ 0.426 s]
    [INFO] Flink : Connectors : Hive .......................... SUCCESS [04:35 min]
    [INFO] Flink : Connectors : JDBC .......................... SUCCESS [05:52 min]
    [INFO] Flink : Connectors : RabbitMQ ...................... SUCCESS [ 6.576 s]
    [INFO] Flink : Connectors : Twitter ....................... SUCCESS [ 25.245 s]
    [INFO] Flink : Connectors : Nifi .......................... SUCCESS [ 28.031 s]
    [INFO] Flink : Connectors : Cassandra ..................... SUCCESS [ 26.062 s]
    [INFO] Flink : Metrics : JMX .............................. SUCCESS [ 0.188 s]
    [INFO] Flink : Formats : Avro confluent registry .......... SUCCESS [ 55.983 s]
    [INFO] Flink : Test utils : Testing Framework ............. SUCCESS [ 5.095 s]
    [INFO] Flink : Connectors : Kafka ......................... SUCCESS [01:16 min]
    [INFO] Flink : Connectors : Google PubSub ................. SUCCESS [ 46.304 s]
    [INFO] Flink : Connectors : Kinesis ....................... SUCCESS [04:09 min]
    [INFO] Flink : Connectors : Pulsar ........................ SUCCESS [08:09 min]
    [INFO] Flink : Connectors : SQL : Elasticsearch 6 ......... SUCCESS [ 6.595 s]
    [INFO] Flink : Connectors : SQL : Elasticsearch 7 ......... SUCCESS [ 8.016 s]
    [INFO] Flink : Connectors : SQL : HBase 1.4 ............... SUCCESS [ 7.123 s]
    [INFO] Flink : Connectors : SQL : HBase 2.2 ............... SUCCESS [ 14.437 s]
    [INFO] Flink : Connectors : SQL : Hive 1.2.2 .............. SUCCESS [04:18 min]
    [INFO] Flink : Connectors : SQL : Hive 2.2.0 .............. SUCCESS [01:54 min]
    [INFO] Flink : Connectors : SQL : Hive 2.3.6 .............. SUCCESS [05:09 min]
    [INFO] Flink : Connectors : SQL : Hive 3.1.2 .............. SUCCESS [08:55 min]
    [INFO] Flink : Connectors : SQL : Kafka ................... SUCCESS [ 1.015 s]
    [INFO] Flink : Connectors : SQL : Kinesis ................. SUCCESS [ 9.111 s]
    [INFO] Flink : Formats : Sequence file .................... SUCCESS [ 0.436 s]
    [INFO] Flink : Formats : Compress ......................... SUCCESS [ 0.445 s]
    [INFO] Flink : Formats : Avro AWS Glue Schema Registry .... SUCCESS [02:30 min]
    [INFO] Flink : Formats : SQL Orc .......................... SUCCESS [ 0.267 s]
    [INFO] Flink : Formats : SQL Parquet ...................... SUCCESS [ 0.630 s]
    [INFO] Flink : Formats : SQL Avro ......................... SUCCESS [ 1.222 s]
    [INFO] Flink : Formats : SQL Avro Confluent Registry ...... SUCCESS [ 5.252 s]
    [INFO] Flink : Examples : Streaming ....................... SUCCESS [ 15.564 s]
    [INFO] Flink : Examples : Table ........................... SUCCESS [ 6.001 s]
    [INFO] Flink : Examples : Build Helper : .................. SUCCESS [ 0.062 s]
    [INFO] Flink : Examples : Build Helper : Streaming Twitter SUCCESS [ 0.495 s]
    [INFO] Flink : Examples : Build Helper : Streaming State machine SUCCESS [ 0.482 s]
    [INFO] Flink : Examples : Build Helper : Streaming Google PubSub SUCCESS [ 14.322 s]
    [INFO] Flink : Container .................................. SUCCESS [ 0.174 s]
    [INFO] Flink : Queryable state : Runtime .................. SUCCESS [ 0.441 s]
    [INFO] Flink : Kubernetes ................................. SUCCESS [01:24 min]
    [INFO] Flink : Yarn ....................................... SUCCESS [ 1.518 s]
    [INFO] Flink : Libraries : Gelly .......................... SUCCESS [ 1.530 s]
    [INFO] Flink : Libraries : Gelly scala .................... SUCCESS [ 10.840 s]
    [INFO] Flink : Libraries : Gelly Examples ................. SUCCESS [ 13.025 s]
    [INFO] Flink : External resources : ....................... SUCCESS [ 0.023 s]
    [INFO] Flink : External resources : GPU ................... SUCCESS [ 0.144 s]
    [INFO] Flink : Metrics : Dropwizard ....................... SUCCESS [ 0.162 s]
    [INFO] Flink : Metrics : Graphite ......................... SUCCESS [ 2.120 s]
    [INFO] Flink : Metrics : InfluxDB ......................... SUCCESS [ 39.168 s]
    [INFO] Flink : Metrics : Prometheus ....................... SUCCESS [ 12.082 s]
    [INFO] Flink : Metrics : StatsD ........................... SUCCESS [ 0.118 s]
    [INFO] Flink : Metrics : Datadog .......................... SUCCESS [ 0.252 s]
    [INFO] Flink : Metrics : Slf4j ............................ SUCCESS [ 0.106 s]
    [INFO] Flink : Libraries : CEP Scala ...................... SUCCESS [ 7.770 s]
    [INFO] Flink : Table : Uber ............................... SUCCESS [ 7.552 s]
    [INFO] Flink : Python ..................................... SUCCESS [10:14 min]
    [INFO] Flink : Table : SQL Client ......................... SUCCESS [ 10.765 s]
    [INFO] Flink : Libraries : State processor API ............ SUCCESS [ 0.733 s]
    [INFO] Flink : Scala shell ................................ SUCCESS [ 11.145 s]
    [INFO] Flink : Dist ....................................... SUCCESS [03:24 min]
    [INFO] Flink : Yarn Tests ................................. SUCCESS [ 3.867 s]
    [INFO] Flink : E2E Tests : ................................ SUCCESS [02:42 min]
    [INFO] Flink : E2E Tests : CLI ............................ SUCCESS [ 0.131 s]
    [INFO] Flink : E2E Tests : Parent Child classloading program SUCCESS [ 0.119 s]
    [INFO] Flink : E2E Tests : Parent Child classloading lib-package SUCCESS [ 0.103 s]
    [INFO] Flink : E2E Tests : Dataset allround ............... SUCCESS [ 0.119 s]
    [INFO] Flink : E2E Tests : Dataset Fine-grained recovery .. SUCCESS [ 0.147 s]
    [INFO] Flink : E2E Tests : Datastream allround ............ SUCCESS [ 0.710 s]
    [INFO] Flink : E2E Tests : Batch SQL ...................... SUCCESS [ 0.112 s]
    [INFO] Flink : E2E Tests : Stream SQL ..................... SUCCESS [ 0.122 s]
    [INFO] Flink : E2E Tests : Distributed cache via blob ..... SUCCESS [ 0.131 s]
    [INFO] Flink : E2E Tests : High parallelism iterations .... SUCCESS [ 7.028 s]
    [INFO] Flink : E2E Tests : Stream stateful job upgrade .... SUCCESS [ 0.577 s]
    [INFO] Flink : E2E Tests : Queryable state ................ SUCCESS [ 1.534 s]
    [INFO] Flink : E2E Tests : Local recovery and allocation .. SUCCESS [ 0.152 s]
    [INFO] Flink : E2E Tests : Elasticsearch 5 ................ SUCCESS [ 5.104 s]
    [INFO] Flink : E2E Tests : Elasticsearch 6 ................ SUCCESS [ 2.661 s]
    [INFO] Flink : Quickstart : ............................... SUCCESS [ 0.543 s]
    [INFO] Flink : Quickstart : Java .......................... SUCCESS [01:24 min]
    [INFO] Flink : Quickstart : Scala ......................... SUCCESS [ 0.071 s]
    [INFO] Flink : E2E Tests : Quickstart ..................... SUCCESS [ 0.315 s]
    [INFO] Flink : E2E Tests : Confluent schema registry ...... SUCCESS [ 2.002 s]
    [INFO] Flink : E2E Tests : Stream state TTL ............... SUCCESS [ 5.917 s]
    [INFO] Flink : E2E Tests : SQL client ..................... SUCCESS [01:18 min]
    [INFO] Flink : E2E Tests : File sink ...................... SUCCESS [ 0.898 s]
    [INFO] Flink : E2E Tests : State evolution ................ SUCCESS [ 0.501 s]
    [INFO] Flink : E2E Tests : RocksDB state memory control ... SUCCESS [ 0.534 s]
    [INFO] Flink : E2E Tests : Common ......................... SUCCESS [ 0.578 s]
    [INFO] Flink : E2E Tests : Metrics availability ........... SUCCESS [ 0.165 s]
    [INFO] Flink : E2E Tests : Metrics reporter prometheus .... SUCCESS [ 0.197 s]
    [INFO] Flink : E2E Tests : Heavy deployment ............... SUCCESS [ 8.971 s]
    [INFO] Flink : E2E Tests : Connectors : Google PubSub ..... SUCCESS [ 48.814 s]
    [INFO] Flink : E2E Tests : Streaming Kafka base ........... SUCCESS [ 0.183 s]
    [INFO] Flink : E2E Tests : Streaming Kafka ................ SUCCESS [ 8.357 s]
    [INFO] Flink : E2E Tests : Plugins : ...................... SUCCESS [ 0.049 s]
    [INFO] Flink : E2E Tests : Plugins : Dummy fs ............. SUCCESS [ 0.093 s]
    [INFO] Flink : E2E Tests : Plugins : Another dummy fs ..... SUCCESS [ 0.147 s]
    [INFO] Flink : E2E Tests : TPCH ........................... SUCCESS [ 5.189 s]
    [INFO] Flink : E2E Tests : Streaming Kinesis .............. SUCCESS [ 17.962 s]
    [INFO] Flink : E2E Tests : Elasticsearch 7 ................ SUCCESS [ 3.020 s]
    [INFO] Flink : E2E Tests : Common Kafka ................... SUCCESS [ 1.408 s]
    [INFO] Flink : E2E Tests : TPCDS .......................... SUCCESS [ 0.366 s]
    [INFO] Flink : E2E Tests : Netty shuffle memory control ... SUCCESS [ 0.132 s]
    [INFO] Flink : E2E Tests : Python ......................... SUCCESS [ 6.509 s]
    [INFO] Flink : E2E Tests : HBase .......................... SUCCESS [ 1.420 s]
    [INFO] Flink : E2E Tests : AWS Glue Schema Registry ....... SUCCESS [ 19.292 s]
    [INFO] Flink : E2E Tests : Pulsar ......................... SUCCESS [ 1.578 s]
    [INFO] Flink : State backends : Heap spillable ............ SUCCESS [ 0.270 s]
    [INFO] Flink : Contrib : .................................. SUCCESS [ 0.022 s]
    [INFO] Flink : Contrib : Connectors : Wikiedits ........... SUCCESS [ 2.737 s]
    [INFO] Flink : FileSystems : Tests ........................ SUCCESS [ 0.718 s]
    [INFO] Flink : Docs ....................................... SUCCESS [ 8.164 s]
    [INFO] Flink : Walkthrough : .............................. SUCCESS [ 0.026 s]
    [INFO] Flink : Walkthrough : Common ....................... SUCCESS [ 0.232 s]
    [INFO] Flink : Walkthrough : Datastream Java .............. SUCCESS [ 0.061 s]
    [INFO] Flink : Walkthrough : Datastream Scala ............. SUCCESS [ 0.059 s]
    [INFO] Flink : Tools : CI : Java .......................... SUCCESS [ 49.851 s]
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD SUCCESS
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 02:18 h
    [INFO] Finished at: 2023-05-25T13:05:59+08:00
    [INFO] ------------------------------------------------------------------------
    [WARNING]
    [WARNING] Plugin validation issues were detected in 38 plugin(s)
    [WARNING]
    [WARNING] * org.apache.maven.plugins:maven-compiler-plugin:3.8.0
    [WARNING] * com.github.eirslett:frontend-maven-plugin:1.9.1
    [WARNING] * org.apache.maven.plugins:maven-checkstyle-plugin:2.17
    [WARNING] * org.apache.maven.plugins:maven-clean-plugin:3.1.0
    [WARNING] * org.apache.maven.plugins:maven-assembly-plugin:3.0.0
    [WARNING] * com.github.os72:protoc-jar-maven-plugin:3.11.4
    [WARNING] * com.github.siom79.japicmp:japicmp-maven-plugin:0.11.0
    [WARNING] * org.codehaus.gmavenplus:gmavenplus-plugin:1.8.1
    [WARNING] * org.apache.maven.plugins:maven-jar-plugin:2.4
    [WARNING] * org.apache.maven.plugins:maven-antrun-plugin:1.8
    [WARNING] * org.apache.maven.plugins:maven-clean-plugin:2.5
    [WARNING] * org.apache.maven.plugins:maven-install-plugin:2.5.2
    [WARNING] * org.apache.maven.plugins:maven-jar-plugin:2.5
    [WARNING] * net.alchim31.maven:scala-maven-plugin:3.2.2
    [WARNING] * org.apache.avro:avro-maven-plugin:1.10.0
    [WARNING] * org.antlr:antlr3-maven-plugin:3.5.2
    [WARNING] * com.googlecode.fmpp-maven-plugin:fmpp-maven-plugin:1.0
    [WARNING] * org.antlr:antlr4-maven-plugin:4.7
    [WARNING] * org.apache.maven.plugins:maven-antrun-plugin:1.7
    [WARNING] * com.diffplug.spotless:spotless-maven-plugin:2.4.2
    [WARNING] * org.apache.maven.plugins:maven-assembly-plugin:2.4
    [WARNING] * org.apache.maven.plugins:maven-remote-resources-plugin:1.5
    [WARNING] * org.apache.maven.plugins:maven-archetype-plugin:2.2
    [WARNING] * org.apache.maven.plugins:maven-shade-plugin:3.1.1
    [WARNING] * org.scalastyle:scalastyle-maven-plugin:1.0.0
    [WARNING] * pl.project13.maven:git-commit-id-plugin:4.0.2
    [WARNING] * org.apache.maven.plugins:maven-surefire-plugin:2.22.2
    [WARNING] * org.apache.maven.plugins:maven-surefire-plugin:2.22.1
    [WARNING] * org.apache.maven.plugins:maven-enforcer-plugin:3.0.0-M1
    [WARNING] * org.apache.maven.plugins:maven-dependency-plugin:3.1.1
    [WARNING] * org.apache.maven.plugins:maven-source-plugin:3.0.1
    [WARNING] * org.apache.maven.plugins:maven-resources-plugin:3.1.0
    [WARNING] * org.commonjava.maven.plugins:directory-maven-plugin:0.1
    [WARNING] * org.apache.maven.plugins:maven-surefire-plugin:2.19.1
    [WARNING] * org.codehaus.mojo:exec-maven-plugin:1.5.0
    [WARNING] * org.xolstice.maven.plugins:protobuf-maven-plugin:0.5.1
    [WARNING] * org.xolstice.maven.plugins:protobuf-maven-plugin:0.6.1
    [WARNING] * org.apache.rat:apache-rat-plugin:0.12
    [WARNING]
    [WARNING] For more or less details, use 'maven.plugin.validation' property with one of the values (case insensitive): [BRIEF, DEFAULT, VERBOSE]
    [WARNING]
    [root@TCT003 flink-1.14.4]#

III. Testing

1. Locate the compiled Flink 1.14.4 distribution

        /root/flink-1.14.4/flink-dist/target/flink-1.14.4-bin 
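The directory above is an exploded distribution. To ship it to the cluster I simply tar it up and unpack it under /opt on the target node; the /opt path and the TCT001 host are just how my environment is laid out, so adapt them as needed:

        cd /root/flink-1.14.4/flink-dist/target/flink-1.14.4-bin
        tar -zcf flink-1.14.4.tgz flink-1.14.4
        scp flink-1.14.4.tgz root@TCT001:/opt/
        ssh root@TCT001 "cd /opt && tar -zxf flink-1.14.4.tgz"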

2. Populate Flink's lib directory

Where each jar comes from:

flink-sql-connector-hive-2.2.0_2.11-1.14.4.jar

        /root/flink-1.14.4/flink-connectors/flink-sql-connector-hive-2.2.0/target/flink-sql-connector-hive-2.2.0_2.11-1.14.4.jar

flink-connector-hive_2.11-1.14.4.jar 

        /root/flink-1.14.4/flink-connectors/flink-connector-hive/target/flink-connector-hive_2.11-1.14.4.jar

libfb303-0.9.3.jar

         available in the CDH lib/jars directory
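Putting it together, all three jars end up in the lib directory of the deployed Flink. A sketch of the copy commands; the parcel path used for libfb303 is typical for CDH 6 but may differ on your installation:

        FLINK_LIB=/opt/flink-1.14.4/lib
        cp /root/flink-1.14.4/flink-connectors/flink-sql-connector-hive-2.2.0/target/flink-sql-connector-hive-2.2.0_2.11-1.14.4.jar $FLINK_LIB/
        cp /root/flink-1.14.4/flink-connectors/flink-connector-hive/target/flink-connector-hive_2.11-1.14.4.jar $FLINK_LIB/
        cp /opt/cloudera/parcels/CDH/jars/libfb303-0.9.3.jar $FLINK_LIB/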

3. Verification with the SQL Client

Note: I have already set up a Flink standalone cluster, so the cluster configuration itself is not repeated here (the start command is sketched below).
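For completeness, with the distribution and the extra jars in place on every node, the standalone cluster is brought up with the stock script (this assumes conf/workers and conf/flink-conf.yaml are already filled in for your hosts):

        /opt/flink-1.14.4/bin/start-cluster.sh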

    [root@TCT001 bin]# cat /opt/flink-1.14.4/conf/sql-conf.sql
    CREATE CATALOG my_hive WITH (
        'type' = 'hive',
        'hive-version' = '2.1.1',
        'default-database' = 'default',
        'hive-conf-dir' = '/etc/hive/conf.cloudera.hive/',
        'hadoop-conf-dir' = '/etc/hadoop/conf.cloudera.hdfs/'
    );
    -- set the HiveCatalog as the current catalog of the session
    USE CATALOG my_hive;
    [root@TCT001 bin]#
    [root@TCT001 bin]# ./sql-client.sh embedded -i ../conf/sql-conf.sql
    Setting HBASE_CONF_DIR=/etc/hbase/conf because no HBASE_CONF_DIR was set.
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/opt/flink-1.14.4/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/jars/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
    Successfully initialized from sql script: file:/opt/flink-1.14.4/bin/../conf/sql-conf.sql
    Command history file path: /root/.flink-sql-history

    [Flink squirrel logo and "Flink SQL Client" ASCII-art banner omitted]

    Welcome! Enter 'HELP;' to list all available commands. 'QUIT;' to exit.

    Flink SQL> show tables;
    +------------+
    | table name |
    +------------+
    | hive_k     |
    | k          |
    +------------+
    2 rows in set

    Flink SQL> set table.sql-dialect=hive;
    [INFO] Session property has been set.

    Flink SQL> select * from hive_k;
    2023-05-25 18:07:14,422 INFO  org.apache.hadoop.mapred.FileInputFormat [] - Total input files to process : 0
    SQL Query Result (Table)
    Table program finished.    Page: Last of 1    Updated: 18:07:19.966
    tinyint0 smallint1 int2 bigint3 float4 double5 decimal6 boolean7 char8 varchar9 stri

IV. Related downloads

If you are on the same versions as I am, you can download the prebuilt package below and use it directly.

flink-1.14.4.tgz (download)

Link: https://pan.baidu.com/s/1L3idoPOs_LzgamOt9e5rZg?pwd=gqj0
Extraction code: gqj0
