
Kafka 0.10 — Part 00: Simple Producer Usage, Writing Data to Kafka in a Loop
package Kafka010.Utils

import java.util.Properties

import org.apache.kafka.clients.producer.{KafkaProducer, ProducerConfig, ProducerRecord}

/**
 * Created by Shi shuai RollerQing on 2019/12/24 20:19
 */
object ProducerDemo {
  def main(args: Array[String]): Unit = {
    // Kafka connection parameters
    val brokers = "hadoop01:9092,hadoop02:9092,hadoop03:9092"
    val topic = "topicB"
    val prop = new Properties()

    // Raw string keys also work, but the ProducerConfig constants guard
    // against typos (the key is "bootstrap.servers", not "bootstraps"):
    //prop.put("bootstrap.servers", brokers)
    //prop.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    //prop.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    prop.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, brokers)
    prop.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer")
    prop.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer")

    // Create the producer
    val producer: KafkaProducer[String, String] = new KafkaProducer[String, String](prop)

    // Send 100,000 records; send() is asynchronous and returns immediately
    for (i <- 1 to 100000) {
      val msg = new ProducerRecord[String, String](topic, i.toString, i.toString)
      producer.send(msg)
      println(s"i = $i")
      Thread.sleep(100)
    }

    // Flush any buffered records and release network resources
    producer.close()
  }
}
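The `producer.send(msg)` call in the loop above is fire-and-forget: if the broker rejects a record, the failure is silently dropped. The standard way to observe delivery results is to pass a `Callback` to `send()`. Below is a minimal sketch of that pattern, reusing the same (assumed) broker addresses and topic name from the demo; `ProducerWithCallbackDemo` is a hypothetical name for illustration:

```scala
package Kafka010.Utils

import java.util.Properties

import org.apache.kafka.clients.producer.{Callback, KafkaProducer, ProducerConfig, ProducerRecord, RecordMetadata}

object ProducerWithCallbackDemo {
  def main(args: Array[String]): Unit = {
    val prop = new Properties()
    prop.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "hadoop01:9092,hadoop02:9092,hadoop03:9092")
    prop.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer")
    prop.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer")

    val producer = new KafkaProducer[String, String](prop)
    val record = new ProducerRecord[String, String]("topicB", "key-1", "value-1")

    // The callback runs on the producer's I/O thread once the broker
    // acknowledges the record (metadata set) or the send fails (exception set)
    producer.send(record, new Callback {
      override def onCompletion(metadata: RecordMetadata, exception: Exception): Unit = {
        if (exception != null)
          exception.printStackTrace()
        else
          println(s"sent to partition ${metadata.partition()} at offset ${metadata.offset()}")
      }
    })

    producer.close()
  }
}
```

The send is still asynchronous; the callback only reports the outcome. If you need to block until the record is acknowledged, `producer.send(record).get()` on the returned `Future` does that, at the cost of throughput.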

		<!-- spark-streaming-kafka-0-10 -->
		<dependency>
			<groupId>org.apache.spark</groupId>
			<artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
			<version>2.2.0</version>
		</dependency>
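For an sbt-based Scala project, the equivalent of the Maven snippet above would be (same version, Scala 2.11 binary selected automatically by `%%`):

```scala
// build.sbt — sbt equivalent of the Maven dependency above
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.2.0"
```

Either form pulls in the `kafka-clients` classes (`KafkaProducer`, `ProducerRecord`, etc.) used by the producer code transitively.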

