Flink Installation and Development Environment Setup (Flink 1.8)

Installing Flink

Prerequisites

  • HDFS up and running (passwordless SSH authentication configured)
  • JDK 1.8+

Installation Steps

  • Upload and extract the Flink archive
[root@CentOS ~]# tar -zxf flink-1.8.1-bin-scala_2.11.tgz -C /usr
  • Edit the flink-conf.yaml configuration file
[root@CentOS ~]# vi /usr/flink-1.8.1/conf/flink-conf.yaml
jobmanager.rpc.address: CentOS
taskmanager.numberOfTaskSlots: 4
parallelism.default: 3
  • Edit the slaves file
[root@CentOS ~]# vi /usr/flink-1.8.1/conf/slaves
CentOS
  • Start the Flink cluster
[root@CentOS flink-1.8.1]# ./bin/start-cluster.sh
Starting cluster.
Starting standalonesession daemon on host CentOS.
Starting taskexecutor daemon on host CentOS.
[root@CentOS flink-1.8.1]# jps
2912 Jps
2841 TaskManagerRunner
2397 StandaloneSessionClusterEntrypoint
  • Visit the web UI at http://centos:8081
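For completeness, the standalone cluster can be shut down with the matching script shipped alongside start-cluster.sh. A minimal sketch, assuming the same single-node setup as above:

```shell
# Stop the TaskManager and JobManager daemons started by start-cluster.sh
[root@CentOS flink-1.8.1]# ./bin/stop-cluster.sh
```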

A First Flink Example

Create a Maven project and add the required dependencies:

 <properties>
        <flink.version>1.8.1</flink.version>
        <scala.version>2.11</scala.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients_2.11</artifactId>
            <version>${flink.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-scala_${scala.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <!--Compile the Scala sources into the jar during the package phase-->
            <plugin>
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>4.0.1</version>
                <executions>
                    <execution>
                        <id>scala-compile-first</id>
                        <phase>process-resources</phase>
                        <goals>
                            <goal>add-source</goal>
                            <goal>compile</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <!--Bundle dependency jars into a single (shaded) jar-->
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>2.4.3</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <filters>
                                <filter>
                                    <artifact>*:*</artifact>
                                    <excludes>
                                        <exclude>META-INF/*.SF</exclude>
                                        <exclude>META-INF/*.DSA</exclude>
                                        <exclude>META-INF/*.RSA</exclude>
                                    </excludes>
                                </filter>
                            </filters>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>

Write the client program

package com.hw.demo01

import org.apache.flink.streaming.api.scala.{DataStream, StreamExecutionEnvironment}
import org.apache.flink.streaming.api.scala._

/**
  * @author fql
  * @date 2019/10/14 20:03
  * Flink word-count example
  */
object WordCount_flink {
  def main(args: Array[String]): Unit = {

    // 1. Create the stream execution environment (remote deployment or local execution)
    val fsEnv = StreamExecutionEnvironment.getExecutionEnvironment

    // 2. Read data from an external system (a socket source here)
    val lines: DataStream[String] = fsEnv.socketTextStream("CentOS", 9999)
    lines.flatMap(_.split("\\s+"))
      .map((_, 1))
      .keyBy(t => t._1)
      .sum(1)
      .print()

    // 3. Trigger execution
    fsEnv.execute("wordcount")
  }
}
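Since the job reads from a socket on port 9999, it can be smoke-tested by running a netcat server on the CentOS host and typing lines into it. A hedged sketch, assuming `nc` is installed:

```shell
# Start a netcat server on port 9999; the Flink job connects to it as its source.
# This must be running BEFORE the job starts, or the socket connection fails.
nc -lk 9999
# Then type lines such as:
#   hello flink hello world
# Running counts like (hello,1), (hello,2) appear in the TaskManager stdout
# (the log/*.out files when the job runs on the cluster).
```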

Package the program and submit it

[root@CentOS flink-1.8.1]# ./bin/flink run --class com.hw.demo01.WordCount_flink --detached --parallelism 3 /root/original-flink-1.0-SNAPSHOT.jar
Starting execution of program
Job has been submitted with JobID b7c2555ba5f9e847c426a18e8af0748f

Cancel a job

[root@CentOS flink-1.8.1]# ./bin/flink cancel -m CentOS:8081 221d5fa916523f88741e2abf39453b81
Cancelling job 221d5fa916523f88741e2abf39453b81.
Cancelled job 221d5fa916523f88741e2abf39453b81.
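The JobID passed to `cancel` can be looked up first with the `flink list` subcommand. A minimal sketch against the same JobManager address used above:

```shell
# List scheduled and running jobs, including their JobIDs
[root@CentOS flink-1.8.1]# ./bin/flink list -m CentOS:8081
```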

Deployment Options

  • Command-line script
[root@CentOS flink-1.8.1]# ./bin/flink run --class com.hw.demo01.WordCount_flink --detached --parallelism 3 /root/original-flink-1.0-SNAPSHOT.jar
  • Submission through the web UI

  • Cross-platform (remote environment)
val jarFiles = "flink\\target\\original-flink-1.0-SNAPSHOT.jar" // for testing
val fsEnv = StreamExecutionEnvironment.createRemoteEnvironment("CentOS", 8081, jarFiles)
  • Local simulation
val fsEnv = StreamExecutionEnvironment.createLocalEnvironment(3)
or
val fsEnv = StreamExecutionEnvironment.getExecutionEnvironment // auto-detects the runtime environment; typically used in production