Quality Monitoring Platform ELK

1. Installation method:
- Elk image https://store.docker.com/community/images/sebp/elk
- Documents: https://elk-docker.readthedocs.io/
- Method 1: docker pull sebp/elk
- Method 2: docker pull registry.docker-cn.com/sebp/elk
2. Start ELK
sysctl -w vm.max_map_count=262144
docker run -p 5601:5601 -p 9200:9200 -p 5044:5044 -d --name elk sebp/elk
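Before starting the container, it can help to confirm that `vm.max_map_count` is actually high enough; a minimal check (Linux only, reading the value from procfs):

```shell
# Read the current value from procfs (Linux only);
# Elasticsearch refuses to start below 262144.
current=$(cat /proc/sys/vm/max_map_count)
if [ "$current" -lt 262144 ]; then
  echo "too low: $current"
else
  echo "ok: $current"
fi
```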
3. Enter logs interactively
Whatever is typed on stdin is indexed as a log entry.
Method 1: open a shell inside the elk container and run a Logstash stdin pipeline.
```shell
docker exec -it elk /bin/bash
/opt/logstash/bin/logstash --path.data /tmp/logstash/data -e 'input { stdin {} } output { elasticsearch { hosts => ["localhost"] } }'
```
Type any text; each line is shipped to Elasticsearch as a log entry.
In Kibana, open Discover, create an index pattern matching the newly created index, select the time field, and click Create. Then search the logs.
Method 2: run the Logstash stdin pipeline directly, without entering the elk container
docker run -it --name logstash --rm logstash --path.data /tmp/logstash/data -e 'input { stdin { type => "doc" } } output { elasticsearch { hosts => ["172.16.40.200"] } }'
Note: the Elasticsearch host must be the host machine's IP address, not localhost or 127.0.0.1; inside the logstash container, localhost refers to the container itself, so Logstash fails to connect to Elasticsearch.
4. Batch import local data
4.1 Create the configuration file conf/csv.conf. Note: use the host IP address (check with ifconfig en0).
```
input {
  file {
    path => "/data/*.csv"
    start_position => beginning
  }
}
filter {
  csv {
    columns => ["log_time", "user", "api", "status", "version"]
  }
  date {
    match => ["log_time", "yyyy-MM-dd HH:mm:ss"]
    timezone => "Asia/Shanghai"
  }
}
output {
  elasticsearch {
    hosts => ["172.16.40.200:9200"]
    index => "logstash-seveniruby-%{+YYYY.MM.dd}"
  }
}
```
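Before running the import, it can save time to check the CSV shape locally; a small sketch (the `validate_csv` helper is hypothetical, not part of Logstash) that asserts every row has exactly the 5 columns the csv filter expects:

```shell
# Hypothetical helper: fail if any row does not have exactly 5
# comma-separated columns (log_time, user, api, status, version).
validate_csv() {
  awk -F',' 'NF != 5 { print "bad row " NR ": " $0; bad = 1 } END { exit bad }' "$1"
}

printf '2018-10-28 11:29:00,chenshanju,topics.json,200,7.4\n' > /tmp/check.csv
validate_csv /tmp/check.csv && echo "csv ok"   # → csv ok
```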
4.2 Create the data file data/demo.csv:
```
2018-10-28 11:29:00,chenshanju,topics.json,200,7.4
2018-10-28 11:29:01,chenshanju,topics.json,200,7.4
2018-10-28 11:29:02,chenshanju,topics/3.json,200,7.4
2018-10-28 11:30:01,chenshanju,topics/4.json,200,7.4
2018-10-28 11:30:20,chenshanju,topics/1.json,200,7.4
2018-10-28 11:40:20,chenshanju,topics/5.json,200,7.4
```
4.3 Execution
docker run -it --name logstash --rm -v $PWD/conf:/conf -v /Users/chenshanju/Desktop/docker/data/:/data logstash -f /conf/csv.conf
4.4 Note
Run the command from the directory that contains the conf and data directories, since both are mounted into the container.
4.5 Continuous logs
Open another terminal and execute the following script
```shell
# Note: do not forget the user field
while true
do
  version=$([ $((RANDOM % 5)) -ge 1 ] && echo debug || echo test)
  version=${version}_3.$((RANDOM % 3))
  userlist=(chenshanju chenyi csj java python)
  user=${userlist[$((RANDOM % 5))]}
  api=api/$((RANDOM % 5)).json
  status=$((RANDOM % 5))00
  ip=192.168.0.1$((RANDOM % 5))$((RANDOM % 5))
  echo $(date +"%Y-%m-%d %H:%M:%S"),${user},${ip},${api},${status},${version} | tee -a $(date +"%Y_%m_%d_%H_%M").csv
  sleep 0.$((RANDOM % 5))
done
```
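One detail worth double-checking in scripts like this is the `date` format: the Logstash date filter pattern `yyyy-MM-dd HH:mm:ss` corresponds to `%Y-%m-%d %H:%M:%S` in shell `date` (`%M` is minutes, `%m` is month; mixing them up makes the date filter silently fail to parse). A quick check:

```shell
# Produce a timestamp in the "yyyy-MM-dd HH:mm:ss" shape
# that the csv.conf date filter expects.
ts=$(date +"%Y-%m-%d %H:%M:%S")
echo "$ts"
```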
FAQ:
1. ELK started successfully but no data came through: ELK needs about 4 GB of memory, while Docker was only allocated 2 GB. Raise Docker's memory limit to 4 GB.
10-28 quality monitoring elk