Setting Up an ELK Environment (Part 1)

Introduction

ELK is an acronym for three pieces of open-source software: Elasticsearch, Logstash, and Kibana. A newer addition to the stack is Filebeat, a lightweight log collection agent. Filebeat uses few resources, which makes it well suited to gathering logs on each server and forwarding them to Logstash, and it is the officially recommended tool for that job.

Elasticsearch is an open-source distributed search engine that provides three core capabilities: collecting, analyzing, and storing data. Its notable features include a distributed design, zero configuration, automatic discovery, automatic index sharding, an index replica mechanism, a RESTful interface, multiple data sources, and automatic search load balancing.

Logstash is a tool for collecting, analyzing, and filtering logs, and it supports a large number of ways to ingest data. It typically works in a client/server architecture: the client side is installed on each host whose logs need to be collected, while the server side filters and transforms the logs received from the nodes and then forwards them to Elasticsearch.

Kibana is also free and open source. It provides a friendly web UI for analyzing the logs held by Logstash and Elasticsearch, and helps you aggregate, analyze, and search important log data.

Installation

Elasticsearch

Download: https://www.elastic.co/cn/downloads/elasticsearch

After downloading, unpack the archive, go into the bin directory, and run elasticsearch.bat to start the server.
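
Once it is up, you can verify it over HTTP; a minimal check, assuming the default HTTP port of 9200:

curl http://localhost:9200

A JSON response containing the cluster name and version number means the node is running.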

Official documentation for the HTTP settings: https://www.elastic.co/guide/en/elasticsearch/reference/current/modules-http.html
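
For a local setup the defaults are fine, but these settings can be overridden in config/elasticsearch.yml. A minimal sketch using two common settings (the values shown keep the node listening only on localhost, which is the out-of-the-box behavior):

http.port: 9200
network.host: 127.0.0.1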

Kibana

Download: https://www.elastic.co/cn/downloads/kibana
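
Kibana is configured through config/kibana.yml. With a 6.x release the defaults already point at a local Elasticsearch; a minimal sketch of the two relevant settings (these are the 6.x defaults):

server.port: 5601
elasticsearch.url: "http://localhost:9200"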

Logstash

Download: https://www.elastic.co/cn/downloads/logstash

Configuration

Create a configuration file under the config directory of the unpacked Logstash distribution (the startup command below assumes it lives there); any file name works, for example:

log.conf

input {
  tcp {
    port => 4560
    codec => json_lines
  }
}

output {
  elasticsearch {
    # Elasticsearch address
    hosts => ["localhost:9200"]
    index => "applog"
  }
}
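
The tcp input listens on port 4560 and decodes one JSON document per line, which is exactly what the Logback TCP appender configured below sends. Before starting Logstash, the file can be syntax-checked with the standard --config.test_and_exit flag:

.\logstash -f C:\work\logstash-6.2.3\config\log.conf --config.test_and_exit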

Start Elasticsearch
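
On Windows this means running bin\elasticsearch.bat under the Elasticsearch installation; the path below is an assumption modeled on the Logstash path used later:

C:\work\elasticsearch-6.2.3\bin\elasticsearch.bat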

Start Logstash

.\logstash -f C:\work\logstash-6.2.3\config\log.conf --debug

Start Kibana
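
Similarly, Kibana starts from bin\kibana.bat in its installation directory; the path here is again an assumption, so adjust it to your own layout:

C:\work\kibana-6.2.3-windows-x86_64\bin\kibana.bat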

Add the Logback configuration file logback-spring.xml to the application:

<?xml version="1.0" encoding="UTF-8"?>
<configuration scan="true" scanPeriod="10 seconds">

    <springProperty scope="context" name="springAppName"
                    source="spring.application.name" />

    <property name="CONSOLE_LOG_PATTERN"
              value="%date [%thread] %-5level %logger{36} - %msg%n" />

    <appender name="stdout" class="ch.qos.logback.core.ConsoleAppender">
        <withJansi>true</withJansi>
        <encoder>
            <pattern>${CONSOLE_LOG_PATTERN}</pattern>
            <charset>utf8</charset>
        </encoder>
    </appender>

    <!-- Ship log events as JSON over TCP to the Logstash input defined in log.conf -->
    <appender name="logstash"
              class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>127.0.0.1:4560</destination>
        <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
            <providers>
                <timestamp>
                    <timeZone>UTC</timeZone>
                </timestamp>
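                <!-- The X-B3-* MDC fields referenced below are populated by Spring Cloud Sleuth
                     when it is on the classpath; without Sleuth they are simply left empty -->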
                <pattern>
                    <pattern>
                        {
                        "severity":"%level",
                        "service": "${springAppName:-}",
                        "trace": "%X{X-B3-TraceId:-}",
                        "span": "%X{X-B3-SpanId:-}",
                        "exportable": "%X{X-Span-Export:-}",
                        "pid": "${PID:-}",
                        "thread": "%thread",
                        "class": "%logger{40}",
                        "rest": "%message"
                        }
                    </pattern>
                </pattern>
            </providers>
        </encoder>
    </appender>

    <appender name="dailyRollingFileAppender" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <File>main.log</File>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <FileNamePattern>main.%d{yyyy-MM-dd}.log</FileNamePattern>
            <maxHistory>30</maxHistory>
        </rollingPolicy>
        <encoder>
            <Pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{35} - %msg %n</Pattern>
        </encoder>
        <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
            <level>DEBUG</level>
        </filter>
    </appender>

    <springProfile name="!production">
        <logger name="com.example" level="DEBUG" />
        <logger name="org.springframework.web" level="INFO"/>
        <root level="info">
            <appender-ref ref="stdout" />
            <appender-ref ref="dailyRollingFileAppender" />
            <appender-ref ref="logstash" />
        </root>
    </springProfile>

    <springProfile name="production">
        <logger name="com.example" level="DEBUG" />
        <logger name="org.springframework.web" level="INFO"/>
        <root level="info">
            <appender-ref ref="stdout" />
            <appender-ref ref="dailyRollingFileAppender" />
            <appender-ref ref="logstash" />
        </root>
    </springProfile>
</configuration>

Add the Logstash encoder dependency to the pom:

<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>4.9</version>
</dependency>

Add some test code:

package com.example.sharding.controller;


import com.dangdang.ddframe.rdb.sharding.id.generator.IdGenerator;
import com.example.sharding.entity.Order;
import com.example.sharding.service.OrderService;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/order")
public class OrderController {

    private static final Logger LOGGER = LoggerFactory.getLogger(OrderController.class);

    @Autowired
    private IdGenerator idGenerator;
    @Autowired
    private OrderService orderService;

    @RequestMapping("/add")
    public Object add() {
        for (int i = 0; i < 50; i++) {
            Order order = new Order();
            order.setUserId(idGenerator.generateId().longValue());
            order.setOrderId(idGenerator.generateId().longValue());
            orderService.save(order);
            LOGGER.info(order.toString());
        }
        return "success";
    }

    @RequestMapping("/query")
    public Object queryAll() {
        LOGGER.info("queryAll");
        return orderService.findAll();
    }

    @RequestMapping("/deleteAll")
    public void deleteAll() {
        LOGGER.info("deleteAll");
        orderService.deleteAll();
    }
}

After hitting the endpoints a few times so that some log lines are produced, refresh Elasticsearch and you will see documents arriving in the applog index.
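
One way to exercise the whole pipeline, assuming the application runs on the default port 8080 (the _search endpoint is Elasticsearch's standard query API):

curl http://localhost:8080/order/add
curl "http://localhost:9200/applog/_search?pretty"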

Open Kibana:

http://localhost:5601/app/kibana
