Preface: We had been running ELK for about a month, with Kafka as the buffering queue. For reasons we never pinned down, the pipeline broke several times: ELK would suddenly stop receiving data, sometimes for days on end. After a lot of troubleshooting it would recover, only to fail the same way a few days later. Investigation showed that the data was reaching Kafka, but logstash could not read it. With no better option, we swapped Kafka out for Redis.
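The symptom was easy to reproduce by reading the topic directly: if the console consumer prints events while logstash stays silent, the problem is on the consuming side. A minimal sketch of that check (the topic name proxy-nginx-log is an assumption; older Kafka releases take --zookeeper instead of --bootstrap-server):
$ kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic proxy-nginx-log --from-beginning --max-messages 5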
filebeat configuration
-------------
filebeat:
  prospectors:
    -
      document_type: "web-hkgf-proxy-nginx-access"
      paths:
        - /data/logs/nginx/access/www2.access.log
    -
      document_type: "web-hkgf-proxy-nginx-error"
      paths:
        - /data/logs/nginx/error/www2.error.log

output.redis:
  # events are pushed onto this Redis list; the key must match the logstash redis input below
  hosts: ["59.188.25.xxx:6379"]
  key: "proxy-nginx-log"
  db: 0
  timeout: 5

shipper:
  tags: ["web-hkgf-proxy-nginx-filebeat"]
redis installation and configuration
------------------
1. Install redis with yum. To get a recent version, add the EPEL and Remi repositories first:
$ rpm -Uvh http://download.fedoraproject.org/pub/epel/6/x86_64/epel-release-6-8.noarch.rpm
$ rpm -Uvh http://rpms.famillecollet.com/enterprise/remi-release-6.rpm
$ yum --enablerepo=remi,remi-test install redis
2. Configuration
By default redis listens only on 127.0.0.1; change the bind directive to the machine's IP so that filebeat and logstash on other hosts can reach it, as in the sketch below.
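A minimal sketch of the change, assuming the /etc/redis.conf path and the redis service name used by the EPEL/Remi package; afterwards confirm the server answers over the network:
# /etc/redis.conf
bind 59.188.25.xxx

$ service redis restart
$ redis-cli -h 59.188.25.xxx ping    # should answer PONG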
logstash input
-------------------
input {
  redis {
    data_type => "list"                # filebeat's redis output writes to a list by default
    key       => "proxy-nginx-log"     # must match output.redis.key in filebeat.yml
    host      => "59.188.25.xxx"
    port      => 6379
    threads   => 5
  }
}
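Once filebeat, redis and logstash are all running, the health of the pipeline can be judged from the Redis list itself: filebeat keeps pushing events onto the key and logstash keeps popping them off, so the backlog should stay small. A quick check using the host, db and key configured above:
$ # backlog length; a number that keeps growing means logstash is not keeping up
$ redis-cli -h 59.188.25.xxx -n 0 llen proxy-nginx-log
$ # peek at one raw event to confirm filebeat is writing into the list
$ redis-cli -h 59.188.25.xxx -n 0 lrange proxy-nginx-log 0 0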