Filebeat: Real-Time Log Shipping to Kafka
Edit the configuration, then start Filebeat in the background:

vim filebeat.yml
nohup ./filebeat -c filebeat.yml &
#=========================== Filebeat inputs =============================
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/applogs/*.log
  fields:
    type: appblog
  multiline:
    pattern: ^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}.\d{3}
    negate: true
    match: after

#================================ Outputs =====================================
output.kafka:
  enabled: true
  hosts: ["192.168.1.50:9092"]
  topic: filebeat
The multiline settings merge any line that does not begin with a timestamp into the preceding event, so multi-line entries such as stack traces reach Kafka as a single message. 192.168.1.50:9092 is a single standalone Kafka broker; for a Kafka cluster, list the brokers separated by commas. filebeat is the Kafka topic name; change both values to match your environment. In addition, the following block must be commented out or deleted:
#-------------------------- Elasticsearch output ------------------------------
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["localhost:9200"]

  # Optional protocol and basic auth credentials.
  #protocol: "https"
  #username: "elastic"
  #password: "changeme"
Elasticsearch is not used here, and Filebeat reports an error at startup if more than one output is enabled.
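To confirm that events are actually reaching the topic, one quick check is to tail it with the console consumer that ships with Kafka, using the same broker address and topic name as above. This is only a sketch; it assumes a Kafka installation is available and is run from the Kafka installation directory:

bin/kafka-console-consumer.sh --bootstrap-server 192.168.1.50:9092 --topic filebeat --from-beginning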
Message format

Each event Filebeat writes to the Kafka topic is a JSON document like the following:
{
  "@timestamp": "2019-11-29T04:36:59.633Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "_doc",
    "version": "7.1.0",
    "topic": "filebeat"
  },
  "input": {
    "type": "log"
  },
  "host": {
    "name": "ip-192-168-1-26.ap-southeast-1.compute.internal",
    "hostname": "ip-192-168-1-26.ap-southeast-1.compute.internal",
    "architecture": "x86_64",
    "os": {
      "platform": "amzn",
      "version": "2",
      "family": "redhat",
      "name": "Amazon Linux",
      "kernel": "4.14.114-105.126.amzn2.x86_64",
      "codename": "Karoo"
    },
    "id": "2c6f141f3ddb47649a45fa62264a610e",
    "containerized": true
  },
  "agent": {
    "ephemeral_id": "276f7eba-c8fa-4b78-87b4-4d42d16f0a96",
    "hostname": "ip-192-168-1-26.ap-southeast-1.compute.internal",
    "id": "239a0097-87dd-4f40-9245-a98ec8bc671e",
    "version": "7.1.0",
    "type": "filebeat"
  },
  "ecs": {
    "version": "1.0.0"
  },
  "cloud": {
    "availability_zone": "ap-southeast-1c",
    "instance": {
      "id": "i-09f5e800dac65a867"
    },
    "machine": {
      "type": "m5.large"
    },
    "region": "ap-southeast-1",
    "provider": "aws"
  },
  "log": {
    "offset": 172973,
    "file": {
      "path": "/var/log/applogs/appblog-common.log"
    }
  },
  "message": "The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server."
}
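The original log line is carried in the message field and the source file in log.file.path; the rest is metadata added by Filebeat. As a rough downstream check that prints only the raw log lines, the console consumer output can be piped through jq (assuming jq is installed; this is not part of the original setup):

bin/kafka-console-consumer.sh --bootstrap-server 192.168.1.50:9092 --topic filebeat | jq -r '.message'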