
Kafka: Configuring Kerberos Authentication (JDK 8 and JDK 11)

1. Related configuration

1.1 JAAS configuration file (kafka_client_jaas.conf)

```conf
KafkaClient {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    storeKey=true
    serviceName="kafka"
    keyTab="D:/code/demo/conf/kafka.service.keytab"
    principal="kafka/hdp-1";
};
```
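As an alternative to a JVM-wide JAAS file, kafka-clients (since 0.10.2, KIP-85) accepts the same login-module entry inline through the standard `sasl.jaas.config` client property. A minimal sketch, reusing the keytab path and principal from the file above:

```java
import java.util.Properties;

public class JaasInlineConfig {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Equivalent to the KafkaClient section above, but scoped to this
        // client instance instead of the whole JVM.
        props.put("sasl.jaas.config",
                "com.sun.security.auth.module.Krb5LoginModule required "
                + "useKeyTab=true storeKey=true serviceName=\"kafka\" "
                + "keyTab=\"D:/code/demo/conf/kafka.service.keytab\" "
                + "principal=\"kafka/hdp-1\";");
        System.out.println(props.getProperty("sasl.jaas.config"));
    }
}
```

This is convenient when one JVM runs several clients with different principals, since `java.security.auth.login.config` can only point at one file per process.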

1.2 Keytab file (kafka.service.keytab)

Copy it from the Kerberos server to the target machine, or ask your ops team for a copy.

1.3 Kerberos configuration file (krb5.conf)

Parameter reference: krb5.conf(5)

Copy it from the Kerberos server to the target machine, or ask your ops team for a copy.

```conf
# Configuration snippets may be placed in this directory as well.
# On JDK 11 the includedir line below must be removed.
includedir /etc/krb5.conf.d/

[logging]
 default = FILE:/var/log/krb5libs.log
 kdc = FILE:/var/log/krb5kdc.log
 admin_server = FILE:/var/log/kadmind.log

[libdefaults]
 default_realm = HADOOP.COM
 dns_lookup_realm = false
 dns_lookup_kdc = false
 ticket_lifetime = 24h
 renew_lifetime = 7d
 forwardable = true
 rdns = false
 udp_preference_limit = 1

[realms]
 HADOOP.COM = {
  kdc = hdp-1:88
  admin_server = hdp-1:749
  default_domain = HADOOP.COM
 }

[domain_realm]
 .HADOOP.COM = HADOOP.COM
 HADOOP.COM = HADOOP.COM
```

Tip: JDK 11 changed the sun.security.krb5.Config class (specifically its readConfigFileLines parsing method), so if the includedir line is not removed, loading fails with:

Caused by: KrbException: krb5.conf loading failed
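When krb5.conf parsing or ticket acquisition fails, the JDK's built-in Kerberos debug switches usually pinpoint the offending line. A sketch of the standard system properties to enable before creating any client:

```java
public class KerberosDebug {
    public static void main(String[] args) {
        // Standard JDK debug switches for the Kerberos/GSSAPI stack:
        // prints krb5.conf parsing, KDC exchanges, and JAAS login steps.
        System.setProperty("sun.security.krb5.debug", "true");
        System.setProperty("sun.security.jgss.debug", "true");
        System.out.println(System.getProperty("sun.security.krb5.debug"));
    }
}
```

Set these in the same place the samples below set `java.security.krb5.conf`, then remove them once authentication works, since the output is verbose.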

2. Update the hosts file

On the client machine, map the broker host name to its IP address:

```
192.168.16.14  hdp-1
```
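Because the Kerberos principal embeds the host name (`kafka/hdp-1`), the client must be able to resolve that exact name. A quick resolution check before digging into Kerberos errors, assuming the `hdp-1` mapping above:

```java
import java.net.InetAddress;

public class HostCheck {
    public static void main(String[] args) throws Exception {
        // If this throws UnknownHostException, fix the hosts file first:
        // GSSAPI auth cannot succeed while the broker name does not resolve.
        InetAddress addr = InetAddress.getByName("hdp-1");
        System.out.println(addr.getHostAddress());
    }
}
```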

3. Add the dependency matching your Kafka version

```xml
<!-- Use the kafka-clients version that matches your installed Kafka. -->
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>3.1.0</version>
</dependency>
```

4. Producer sample code

```java
package com.example.demo.kafka;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

/**
 * @Author: meng
 * @Version: 1.0
 */
public class ProductKafkaKerberos {

    public static void main(String[] args) {
        String filePath = System.getProperty("user.dir") + "\\conf\\";
        // Point the JVM at the JAAS and krb5 files from section 1.
        System.setProperty("java.security.auth.login.config", filePath + "kafka_client_jaas.conf");
        System.setProperty("java.security.krb5.conf", filePath + "krb5.conf");

        Properties props = new Properties();
        props.put("bootstrap.servers", "hdp-1:9092");
        props.put("acks", "all");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        // SASL/GSSAPI (Kerberos) settings
        props.put("sasl.mechanism", "GSSAPI");
        props.put("security.protocol", "SASL_PLAINTEXT");
        props.put("sasl.kerberos.service.name", "kafka");

        Producer<String, String> producer = new KafkaProducer<>(props);
        for (int i = 0; i < 3; i++) {
            producer.send(new ProducerRecord<>("test", Integer.toString(i), Integer.toString(i)));
        }
        System.out.println("producer is success");
        producer.close();
    }
}
```
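The fire-and-forget loop above will not surface authentication failures: `send()` returns immediately and errors only appear asynchronously. Blocking on the returned `Future` (or passing a `Callback`) makes a failed SASL/GSSAPI handshake visible at the call site. A sketch using the same assumed broker and topic names:

```java
package com.example.demo.kafka;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;

import java.util.Properties;
import java.util.concurrent.Future;

public class ProducerSyncSend {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "hdp-1:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("sasl.mechanism", "GSSAPI");
        props.put("security.protocol", "SASL_PLAINTEXT");
        props.put("sasl.kerberos.service.name", "kafka");

        try (Producer<String, String> producer = new KafkaProducer<>(props)) {
            Future<RecordMetadata> future =
                    producer.send(new ProducerRecord<>("test", "key", "value"));
            // get() rethrows broker-side and authentication errors here.
            RecordMetadata meta = future.get();
            System.out.printf("sent to partition %d at offset %d%n",
                    meta.partition(), meta.offset());
        }
    }
}
```

This still assumes the JAAS and krb5 system properties are set as in the sample above before the producer is created.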

5. Consumer sample code

```java
package com.example.demo.kafka;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import java.time.Duration;
import java.util.Arrays;
import java.util.Properties;

/**
 * @Author: meng
 * @Version: 1.0
 */
public class ConsumerKafkaKerberos {

    public static void main(String[] args) {
        String filePath = System.getProperty("user.dir") + "\\conf\\";
        // Point the JVM at the JAAS and krb5 files from section 1.
        System.setProperty("java.security.auth.login.config", filePath + "kafka_client_jaas.conf");
        System.setProperty("java.security.krb5.conf", filePath + "krb5.conf");

        Properties props = new Properties();
        props.put("bootstrap.servers", "hdp-1:9092");
        props.put("group.id", "test_group");
        props.put("enable.auto.commit", "true");
        props.put("auto.commit.interval.ms", "1000");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        // SASL/GSSAPI (Kerberos) settings
        props.put("sasl.mechanism", "GSSAPI");
        props.put("security.protocol", "SASL_PLAINTEXT");
        props.put("sasl.kerberos.service.name", "kafka");

        @SuppressWarnings("resource")
        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
        String topic = "test";
        consumer.subscribe(Arrays.asList(topic));
        while (true) {
            try {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(1000));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset = %d, partition = %d, key = %s, value = %s%n",
                            record.offset(), record.partition(), record.key(), record.value());
                }
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
}
```
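With `enable.auto.commit=true` as above, offsets are committed on a timer, so records polled but not yet processed at a crash can be lost or redelivered. A common variant disables auto-commit and commits manually after processing each batch; a sketch under the same assumed broker, group, and topic:

```java
package com.example.demo.kafka;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import java.time.Duration;
import java.util.Arrays;
import java.util.Properties;

public class ManualCommitConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "hdp-1:9092");
        props.put("group.id", "test_group");
        props.put("enable.auto.commit", "false"); // we commit ourselves below
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("sasl.mechanism", "GSSAPI");
        props.put("security.protocol", "SASL_PLAINTEXT");
        props.put("sasl.kerberos.service.name", "kafka");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Arrays.asList("test"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(1000));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset = %d, value = %s%n",
                            record.offset(), record.value());
                }
                // Commit only after the batch has been fully processed.
                consumer.commitSync();
            }
        }
    }
}
```

As with the original sample, the JAAS and krb5 system properties must be set before the consumer is constructed.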

Related posts:

- https://www.cnblogs.com/myownswordsman/p/kafka-security-kerberos.html (Kafka security with Kerberos)
- https://seevae.github.io/2020/09/12/%E8%AF%A6%E8%A7%A3kerberos%E8%AE%A4%E8%AF%81%E6%B5%81%E7%A8%8B/ (a detailed walkthrough of the Kerberos authentication flow)
