ELK Introduction

  • Logstash collects log data from each machine, transforms and processes it, and outputs the result to Elasticsearch for storage.
  • Elasticsearch is a distributed search and analytics engine used for data storage; it provides real-time queries over the data.
  • Kibana is a data visualization service: driven by user actions, it queries Elasticsearch, builds the corresponding analysis results, and presents them to the user as charts.

Architecture

Software download page:
https://www.elastic.co/cn/downloads
All ELK components are written in Java, so a Java runtime is required.

tar -zxf jdk1.8.0_101.tar.gz -C /usr/local/

cat <<'EOF' >> /etc/profile
export JAVA_HOME="/usr/local/jdk1.8.0_101"
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export PATH=$JAVA_HOME/bin:$PATH
EOF

source /etc/profile
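A detail worth noting in the snippet above: quoting the heredoc delimiter (`'EOF'`) writes `${JAVA_HOME}` literally into the profile, so it is expanded at login rather than at write time. A minimal sketch using a temporary file (the path is illustrative):

```shell
# Write the profile lines to a temp file; the quoted delimiter ('EOF')
# keeps ${JAVA_HOME} literal so it is expanded at login, not at write time.
cat <<'EOF' > /tmp/java_profile_test
export JAVA_HOME="/usr/local/jdk1.8.0_101"
export PATH=$JAVA_HOME/bin:$PATH
EOF
# The variable reference survives verbatim:
grep -c 'JAVA_HOME' /tmp/java_profile_test   # prints 2
```

With an unquoted `EOF`, the shell would substitute `$JAVA_HOME` (possibly empty) while writing the file, silently breaking `PATH`.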

Logstash Deployment

  • Install the Logstash service on every server whose logs need to be collected
    Run:

    tar zxf logstash-5.3.1.tar.gz
    cd logstash-5.3.1
    bin/logstash -f tomcat.conf

  • Parse Tomcat logs with Logstash's grok plugin
    Log format:

2017-05-04 14:39:11.350 DEBUG 7243 --- [nio-9002-exec-8] o.s.jdbc.core.JdbcTemplate               : Executing prepared SQL statement [select t.id, t.first_name, t.last_name,t.company_id,t.weixin_id,weixin_number,t.regionlevelone_id,t.regionleveltwo_id,t.regionlevelthree_id,t.member_company_id, t.position_cn,t.department, t.password, t.mobile, t.status, t.avatar,t.email,t.backup_email from member_member t where t.id=? and t.`status` in (1,2,3,4,5,6,10)]

logstash-tomcat.conf

input {
        file {
                type => "pc"
                path => ["/opt/tomcats/apache-tomcat-ncm-institu-pc-8.5.12/logs/catalina.out"]
                start_position => "end"
                codec => multiline {
                        pattern => "^(?<datetime>\d{4}-\d{2}-\d{2}\s\d{2}:\d{2})"
                        negate => true
                        what => "previous"
                }
        }
        file {
                type => "api"
                path => ["/opt/tomcats/apache-tomcat-ncm-institu-api-8.5.12/logs/catalina.out"]
                start_position => "end"
                codec => multiline {
                        pattern => "^(?<datetime>\d{4}-\d{2}-\d{2}\s\d{2}:\d{2})"
                        negate => true
                        what => "previous"
                }
        }
}

filter {
        grok {
                match => [ "message", "(?<datetime>\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2}\.\d{3})\s+(?<Level>\S+).*" ]
        }
}
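As a rough check of the timestamp-and-level prefix that the grok pattern anchors on, the same regular-expression idea can be exercised with grep on the sample log line (a sketch; the line is abbreviated):

```shell
# Abbreviated copy of the sample Tomcat log line from above
line='2017-05-04 14:39:11.350 DEBUG 7243 --- [nio-9002-exec-8] o.s.jdbc.core.JdbcTemplate : Executing prepared SQL statement'
# Extract the leading timestamp and log level, mirroring the grok captures
# (datetime and Level):
echo "$line" | grep -oE '^[0-9]{4}-[0-9]{2}-[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2}\.[0-9]{3} +[A-Z]+'
# prints: 2017-05-04 14:39:11.350 DEBUG
```

Note the escaped dot before the milliseconds; an unescaped `.` would also match any other character in that position.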

output {
        elasticsearch {
                hosts => ["10.211.55.2:9200"]
                index => "logstash-%{+YYYY.MM.dd}"
        }
}
  • Merge stack-trace lines into a single event
    > The multiline codec handles multi-line logs: any line that does not start with a timestamp in the 2017-05-04 14:39:11 format is appended to the previous line

multiline {
        pattern => "^(?<datetime>\d{4}-\d{2}-\d{2}\s\d{2}:\d{2})"
        negate => true    # true: lines that do NOT match the pattern are merged; the default is false
        what => "previous"
}
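The merge behaviour can be imitated outside Logstash. A minimal awk sketch over three fabricated log lines (the stack-trace line is illustrative):

```shell
# Two events plus one continuation line that lacks a leading timestamp.
merged=$(printf '%s\n' \
  '2017-05-04 14:39:11.350 ERROR boom' \
  '    at com.example.Foo.bar(Foo.java:42)' \
  '2017-05-04 14:39:12.001 INFO ok' |
awk '/^2017-/ { if (buf != "") print buf; buf = $0; next }   # new event: flush buffer
              { buf = buf " | " $0 }                         # continuation: append
     END      { if (buf != "") print buf }')
# Two events remain; the "at ..." line was folded into the first one.
echo "$merged"
```

This is the same rule the codec applies: a line matching the timestamp pattern starts a new event, everything else is glued to the previous one.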

logstash-nginx.conf

input {
        file {
                type => "nginx"
                path => [ "/var/log/nginx/*access.log.[0-9]" ]
                start_position => "end"
                codec => json
        }
        file {
                type => "nginx"
                path => [ "/var/log/nginx/*access.log" ]
                start_position => "end"
                codec => json
        }
}

filter {
        grok {
                match => [
                        "message", "%{IPV4:ClientIP} \[%{HTTPDATE:timestamp}\] \"(?:%{WORD:method}) %{NOTSPACE:request} (?:HTTP/%{NUMBER:http_version})\" %{QS:referer} %{QS:ua} \"(?:%{NUMBER:bytes}|-)\" \"%{NUMBER:status}\" \"(?<requesttime>\d.\d+).*\" \"(%{IPV4}:%{POSINT}[, ]{0,2})+\" \"(%{NUMBER:upstream_status}|-)\" \"(?<uptime>\d.\d+).*\"",
                        "message", "%{IPV4:ClientIP} \[%{HTTPDATE:timestamp}\] \"(?:%{WORD:method}) %{NOTSPACE:request} (?:HTTP/%{NUMBER:http_version})\" %{QS:referer} %{QS:ua} \"(?:%{NUMBER:bytes}|-)\" \"%{NUMBER:status}\" \"(?<requesttime>\d.\d+).*\" \"(?<UpstreamHost>.*)\" \"(?<upstream_status>.*)\" \"(?<uptime>.*)\""
                ]
        }
        geoip {
                source => "ClientIP"    # must match the field name captured by grok
        }
        if [message] =~ /favicon\.ico/ {
                drop {}
        }
}

output {
        elasticsearch {
                hosts => ["10.27.10.163:9200"]
                index => "logstash-%{+YYYY.MM.dd}"
        }
}
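The `codec => json` inputs above assume nginx writes its access log as JSON, which is not nginx's default; it requires a custom `log_format` in nginx.conf. A sketch along these lines (field names are illustrative, not taken from this setup, and `escape=json` needs nginx 1.11.8 or later):

```nginx
log_format json_log escape=json '{"ClientIP":"$remote_addr",'
    '"timestamp":"$time_local",'
    '"request":"$request",'
    '"status":"$status",'
    '"bytes":"$body_bytes_sent",'
    '"ua":"$http_user_agent"}';
access_log /var/log/nginx/access.log json_log;
```

With JSON logs, the codec already splits out the fields, so the grok patterns above would only be needed for plain-text access logs.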

Elasticsearch Deployment

  • Install Elasticsearch to store the logs processed by Logstash, one index per day
    Run:
tar -zxf elasticsearch-5.3.1.tar.gz
cd elasticsearch-5.3.1
bin/elasticsearch -d

Note: recent versions of Elasticsearch must be started as a non-root user.

  • Index
{"logstash-2017.05.04":{"aliases":{},"mappings":{"api":{"_all":{"enabled":true,"norms":false},"dynamic_templates":[{"message_field":{"path_match":"message","match_mapping_type":"string","mapping":{"norms":false,"type":"text"}}},{"string_fields":{"match":"*","match_mapping_type":"string","mapping":{"fields":{"keyword":{"type":"keyword"}},"norms":false,"type":"text"}}}],"properties":{"@timestamp":{"type":"date","include_in_all":false},"@version":{"type":"keyword","include_in_all":false},"Level":{"type":"text","norms":false,"fields":{"keyword":{"type":"keyword"}}},"datetime":{"type":"text","norms":false,"fields":{"keyword":{"type":"keyword"}}},"geoip":{"dynamic":"true","properties":{"ip":{"type":"ip"},"latitude":{"type":"half_float"},"location":{"type":"geo_point"},"longitude":{"type":"half_float"}}},"host":{"type":"text","norms":false,"fields":{"keyword":{"type":"keyword"}}},"message":{"type":"text","norms":false},"path":{"type":"text","norms":false,"fields":{"keyword":{"type":"keyword"}}},"type":{"type":"text","norms":false,"fields":{"keyword":{"type":"keyword"}}}}},"_default_":{"_all":{"enabled":true,"norms":false},"dynamic_templates":[{"message_field":{"path_match":"message","match_mapping_type":"string","mapping":{"norms":false,"type":"text"}}},{"string_fields":{"match":"*","match_mapping_type":"string","mapping":{"fields":{"keyword":{"type":"keyword"}},"norms":false,"type":"text"}}}],"properties":{"@timestamp":{"type":"date","include_in_all":false},"@version":{"type":"keyword","include_in_all":false},"geoip":{"dynamic":"true","properties":{"ip":{"type":"ip"},"latitude":{"type":"half_float"},"location":{"type":"geo_point"},"longitude":{"type":"half_float"}}}}}},"settings":{"index":{"refresh_interval":"5s","number_of_shards":"5","provided_name":"logstash-2017.05.04","creation_date":"1493856703734","number_of_replicas":"1","uuid":"c7yhRFOdSfadux-nW8DMCQ","version":{"created":"5030199"}}}}}

Kibana Deployment

  • Kibana reads data from Elasticsearch, parses it, and renders it on a web page
tar zxf kibana-5.3.1.tar.gz
cd kibana-5.3.1
bin/kibana

Edit config/kibana.yml:
Change server.host: "127.0.0.1" to server.host: "10.211.55.3"
Change elasticsearch.url: "" to elasticsearch.url: "http://10.211.55.2:9200"
Change server.basePath: "" to server.basePath: "/kibana"

  • Access
    Kibana listens on port 5601 by default

Nginx reverse proxy

location /kibana/ {
        proxy_pass http://10.211.55.3:5601/;
        auth_basic "Restricted";
        auth_basic_user_file /var/openresty/conf/logstash;
}

Add access authentication

apt-get install apache2-utils
htpasswd -c -d /var/openresty/conf/logstash username
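If apache2-utils is not available, an equivalent htpasswd entry can be generated with openssl; a sketch where the username, password, and file path are all placeholders:

```shell
# Generate an htpasswd-style line using Apache's apr1 (MD5) scheme,
# which nginx's auth_basic accepts:
printf 'kibana:%s\n' "$(openssl passwd -apr1 's3cret')" > /tmp/htpasswd_demo
# Each entry has the form kibana:$apr1$<salt>$<hash>
grep -c '^kibana:\$apr1\$' /tmp/htpasswd_demo   # prints 1
```

The apr1 scheme also avoids a limitation of `htpasswd -d` (plain crypt), which only uses the first 8 characters of the password.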

  • Create an index pattern
  • Query syntax

On Kibana's Discover page you can enter a query to filter the data. Queries use Elasticsearch's Query String syntax, not the Query DSL; see the official query-string-syntax documentation. Some common forms:

Full-text search on a single field, e.g. documents whose args field contains first: args:first

Exact match on a single field, e.g. documents whose args field is the phrase first: args:"first"

Combining conditions with NOT, AND and OR (they must be upper case), e.g. args:("first" OR "second") AND NOT agent:"third"
Field existence: _exists_:agent requires the agent field to be present, _missing_:agent requires it to be absent;
Wildcards: ? matches a single character, * matches any number of characters.
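Applied to the fields this pipeline actually extracts (Level, status, ClientIP, request), some illustrative Discover queries (the values are made up):

```
Level:ERROR
status:404 AND NOT ClientIP:"10.211.55.1"
request:/api/* AND _exists_:upstream_status
```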

Counting PV & UV