
Logstash 7.11.1: collecting ClickHouse logs into Elasticsearch


1. ClickHouse log format:

    2021.04.14 09:20:43.114711 [ 1 ] {} <Information> Application: Will watch for the process with pid 44
    2021.04.14 09:20:43.114917 [ 44 ] {} <Information> Application: Forked a child process to watch
    2021.04.14 09:20:43.115122 [ 44 ] {} <Information> SentryWriter: Sending crash reports is disabled
    2021.04.14 09:20:43.115331 [ 44 ] {} <Trace> Pipe: Pipe capacity is 1.00 MiB
    2021.04.14 09:20:43.172831 [ 44 ] {} <Information> : Starting ClickHouse 21.2.7.11 with revision 54447, build id: 982020A4FECFEEE237071F94D04B9D693ADFA78D, PID 44
    2021.04.14 09:20:43.173005 [ 44 ] {} <Information> Application: starting up
    2021.04.14 09:20:44.048112 [ 44 ] {} <Information> Application: Calculated checksum of the binary: C862DF75BDC516833103C0C73375024F, integrity check passed.
    2021.04.14 09:20:44.048195 [ 44 ] {} <Information> Application: It looks like the process has no CAP_IPC_LOCK capability, binary mlock will be disabled. It could happen due to incorrect ClickHouse package installation. You could resolve the problem manually with 'sudo setcap cap_ipc_lock=+ep /usr/bin/clickhouse'. Note that it will not work on 'nosuid' mounted filesystems.
    2021.04.14 09:20:44.048703 [ 44 ] {} <Debug> Application: rlimit on number of file descriptors is 65536
    2021.04.14 09:20:44.048730 [ 44 ] {} <Debug> Application: Initializing DateLUT.
    2021.04.14 09:20:44.048748 [ 44 ] {} <Trace> Application: Initialized DateLUT with time zone 'UTC'.
    2021.04.14 09:20:44.048796 [ 44 ] {} <Debug> Application: Setting up /var/lib/clickhouse/tmp/ to store temporary data in it
    2021.04.14 09:20:44.061626 [ 44 ] {} <Debug> Application: Configuration parameter 'interserver_http_host' doesn't exist or exists and empty. Will use 'ab06b1a21b9b' as replica host.
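Each line has three pieces we care about: the timestamp at the start, the log level inside the angle brackets, and everything after the closing bracket as the message body. Here is a minimal Ruby sketch (not from the original post; plain Regexp instead of grok, mirroring the patterns used in the Logstash config below) of that extraction against one sample line:

    # Extract the same three fields the grok filters below will produce
    # (time, logLevel, logContent) from a single ClickHouse log line.
    line = "2021.04.14 09:20:44.048112 [ 44 ] {} <Information> Application: starting up"

    time        = line[/\d{4}\.\d{2}\.\d{2} \d{2}:\d{2}:\d{2}\.\d+/]  # "2021.04.14 09:20:44.048112"
    log_level   = line[/(?<=<).*?(?=>)/]                              # "Information"
    log_content = line[/(?<=>).*/]                                    # " Application: starting up"

    puts time, log_level, log_content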

2. Logstash 7.x requires JDK 1.8 or later, so make sure the Java environment is configured correctly. Downloading and unpacking Logstash is not covered here.

Go into logstash-7.11.1/config, copy logstash-sample.conf to a new file named logstash.conf, and edit logstash.conf so its contents look like this:

    # Sample Logstash configuration for creating a simple
    # Beats -> Logstash -> Elasticsearch pipeline.
    input {
      file {                # watch the .log files generated in the ClickHouse log directory
        type => "clickhouselog"
        path => "/home/its/logs/clickhouse/*.log"
        discover_interval => 10
        start_position => "beginning"
      }
    }
    filter {
      if [type] == "clickhouselog" {
        grok {              # extract the date/time with a regex and store it in the "time" field
          match => ["message", "(?<time>%{YEAR}[./-]%{MONTHNUM}[./-]%{MONTHDAY}[- ]%{HOUR}:%{MINUTE}:%{SECOND})"]
        }
        grok {              # take the text inside <> as the log level, e.g. Debug, Information, Trace
          match => {
            "message" => "(?<logLevel>(?<=<).*?(?=>))"
          }
        }
        grok {              # take everything after ">" as the message body
          match => {
            "message" => "(?<logContent>(?<=>)(.*)/?)"
          }
        }
        ruby {              # parse the time string into a timestamp and store it in "collet_time"
          code => "
            event.set('collet_time', Time.parse(event.get('time')))
          "
        }
        # date {
        #   match => [ "time", "yyyy-MMM-dd HH:mm:ss Z", "ISO8601" ]
        #   locale => "cn"
        #   add_tag => "@timestamp"
        #   timezone => "Asia/Shanghai"
        # }
        mutate {            # remove the temporary field so it is not indexed into ES
          remove_field => [ "time" ]
        }
      }
    }
    output {                # Elasticsearch output
      elasticsearch {
        index => "log-%{+YYYY.MM.dd}"
        hosts => ["192.168.100.41:8200"]
      }
      stdout { codec => rubydebug }
    }
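To make the ruby filter step concrete (this sketch is not from the original post): Time.parse turns the dotted ClickHouse timestamp string captured in the time field into a Ruby Time object, which is what ends up in collet_time. Run in plain Ruby:

    require 'time'

    # What the ruby filter does with the extracted "time" field.
    # Note: Time.parse assumes the local time zone when none is given in the string.
    raw = "2021.04.14 09:20:44.048112"
    t = Time.parse(raw)

    puts t             # e.g. 2021-04-14 09:20:44 +0000 (offset depends on your zone)
    puts t.iso8601(6)  # e.g. 2021-04-14T09:20:44.048112+00:00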

3. Start the Logstash service

    # go to the logstash bin directory: /home/soft/app/logstash-7.11.1/bin
    [root@localhost bin]# ./logstash -f /home/soft/app/logstash-7.11.1/config/logstash.conf

If Logstash starts printing parsed events to stdout (the rubydebug output configured above), the logs are being written correctly.
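As an additional check (not part of the original post), you can query Elasticsearch directly to confirm that documents are landing in the daily index. A minimal Ruby sketch, assuming ES is reachable at the 192.168.100.41:8200 address used in the output section and that the index date roughly matches your local date:

    require 'net/http'
    require 'json'

    # Count documents in today's "log-YYYY.MM.dd" index via the _count API.
    index = "log-#{Time.now.strftime('%Y.%m.%d')}"
    uri   = URI("http://192.168.100.41:8200/#{index}/_count")

    response = Net::HTTP.get_response(uri)
    count    = JSON.parse(response.body)["count"]
    puts "#{index}: #{count} documents"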

 

Thanks to the authors of the following articles:

https://developer.aliyun.com/article/154341

https://ruby-doc.org/stdlib-2.4.1/libdoc/date/rdoc/Date.html#method-i-to_datetime

https://www.junmajinlong.com/ruby/ruby_datetime/

https://blog.csdn.net/cai750415222/article/details/86614854
