[ELK Stack] Building an Elastic (ELK) Stack (Beats, Logstash, ElasticSearch, Kibana)
ossians | 2019. 1. 31. 23:52
What is the Elastic (ELK) Stack?
It is a set of Elastic products that pulls whatever data you want from your servers and lets you search, analyze, and visualize that data in real time.
Elastic bundles Beats, Logstash, Elasticsearch, and Kibana together and offers them as a single service called the Elastic Stack.
Elastic (ELK) Stack Service Layout
Beats Server - (Log Push Client)
IP : 192.168.126.137
OS : CentOS 7
IP : 192.168.126.138
OS : Windows Server 2012 R2
Logstash Server
IP : 192.168.126.139
OS : CentOS 7
ElasticSearch & Kibana Server
IP : 192.168.126.141
OS : CentOS 7
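Putting the pieces together, the data in this lab flows roughly as follows (the ports are the ones configured later in this post):

Filebeat   (192.168.126.137) --> Logstash :50441 (192.168.126.139) --> ElasticSearch :9200 (192.168.126.141) --> Kibana :5601
Winlogbeat (192.168.126.138) --> Logstash :5044  (192.168.126.139) --> ElasticSearch :9200 (192.168.126.141) --> Kibana :5601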
ElasticSearch & Kibana Server Setup
Java JDK Install
[root@elasticsearch ~]# yum list java-*jdk
Loaded plugins: fastestmirror
Loading mirror speeds from cached hostfile
 * base: centos.mirror.cdnetworks.com
 * extras: mirror.kakao.com
 * updates: mirror.kakao.com
Available Packages
java-1.6.0-openjdk.x86_64    1:1.6.0.41-1.13.13.1.el7_3    base
java-1.7.0-openjdk.x86_64    1:1.7.0.201-2.6.16.1.el7_6    updates
java-1.8.0-openjdk.i686      1:1.8.0.191.b12-1.el7_6       updates
java-1.8.0-openjdk.x86_64    1:1.8.0.191.b12-1.el7_6       updates
java-11-openjdk.i686         1:11.0.1.13-3.el7_6           updates
java-11-openjdk.x86_64       1:11.0.1.13-3.el7_6           updates
[root@elasticsearch ~]# yum install -y java-1.8.0-openjdk.x86_64
Loaded plugins: fastestmirror
Loading mirror speeds from cached hostfile
 * base: centos.mirror.cdnetworks.com
 * extras: mirror.kakao.com
 * updates: mirror.kakao.com
Resolving Dependencies
--> Running transaction check
---> Package java-1.8.0-openjdk.x86_64 1:1.8.0.191.b12-1.el7_6 will be installed
...
  nss-softokn-freebl.x86_64 0:3.36.0-5.el7_5    nss-sysinit.x86_64 0:3.36.0-7.el7_5
  nss-tools.x86_64 0:3.36.0-7.el7_5             nss-util.x86_64 0:3.36.0-1.el7_5
Complete!
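As a quick sanity check (not part of the original steps), the installed JDK can be verified before moving on; the exact version string will depend on the build that was installed:

[root@elasticsearch ~]# java -version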
ElasticSearch Install
[root@elasticsearch ~]# rpm --import https://artifacts.elastic.co/GPG-KEY-elasticsearch
[root@elasticsearch ~]# vi /etc/yum.repos.d/elasticsearch.repo

# elasticsearch.repo
[elasticsearch-6.x]
name=Elasticsearch repository for 6.x packages
baseurl=https://artifacts.elastic.co/packages/6.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md

[root@elasticsearch ~]# yum install -y elasticsearch
Loaded plugins: fastestmirror
Loading mirror speeds from cached hostfile
 * base: centos.mirror.cdnetworks.com
 * extras: mirror.kakao.com
 * updates: mirror.kakao.com
elasticsearch-6.x                | 1.3 kB  00:00:00
elasticsearch-6.x/primary        | 150 kB  00:00:03
elasticsearch-6.x
...
Installed:
  elasticsearch.noarch 0:6.6.0-1
Complete!
ElasticSearch Configuration & Start
[root@elasticsearch ~]# cd /etc/elasticsearch/
[root@elasticsearch elasticsearch]# ls
elasticsearch.keystore  jvm.options        role_mapping.yml  users
elasticsearch.yml       log4j2.properties  roles.yml         users_roles
[root@elasticsearch elasticsearch]# vi elasticsearch.yml

# elasticsearch.yml
....
# ---------------------------------- Network -----------------------------------
#
# Set the bind address to a specific IP (IPv4 or IPv6):
#
network.host: 0.0.0.0
#
# Set a custom port for HTTP:
#
http.port: 9200
#
# For more information, consult the network module documentation.
#
....

[root@elasticsearch elasticsearch]# systemctl restart elasticsearch
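The steps above only restart the service. If you also want ElasticSearch to come back automatically after a reboot (not covered in the original steps, but CentOS 7 uses systemd), it can be enabled with:

[root@elasticsearch elasticsearch]# systemctl enable elasticsearch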
ElasticSearch Service Start Check
[root@elasticsearch elasticsearch]# netstat -anpt | grep 9200
tcp6       0      0 :::9200       :::*       LISTEN      1697/java
[root@elasticsearch elasticsearch]# netstat -anpt | grep 9300
tcp6       0      0 :::9300       :::*       LISTEN      1697/java
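Besides netstat, ElasticSearch itself can be asked whether it is up; _cluster/health is a standard API, and a single-node cluster will usually report "yellow" because replica shards have nowhere to be allocated:

[root@elasticsearch elasticsearch]# curl 'http://192.168.126.141:9200/_cluster/health?pretty'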
Kibana Install
[root@elasticsearch ~]# rpm --import https://artifacts.elastic.co/GPG-KEY-elasticsearch
[root@elasticsearch ~]# vi /etc/yum.repos.d/kibana.repo

# kibana.repo
[kibana-6.x]
name=Kibana repository for 6.x packages
baseurl=https://artifacts.elastic.co/packages/6.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md

[root@elasticsearch ~]# yum install -y kibana
Loaded plugins: fastestmirror
Loading mirror speeds from cached hostfile
 * base: centos.mirror.cdnetworks.com
 * extras: mirror.kakao.com
 * updates: mirror.kakao.com
kibana-6.x                | 1.3 kB  00:00:00
kibana-6.x/primary        | 150 kB  00:00:03
....
Installed:
  kibana.x86_64 0:6.6.0-1
Complete!
Kibana Configuration & Start
[root@elasticsearch ~]# cd /etc/kibana/
[root@elasticsearch kibana]# ls
kibana.yml
[root@elasticsearch kibana]# vi kibana.yml

# kibana.yml
# Kibana is served by a back end server. This setting specifies the port to use.
server.port: 5601

# Specifies the address to which the Kibana server will bind. IP addresses and host names are both valid values.
# The default is 'localhost', which usually means remote machines will not be able to connect.
# To allow connections from remote users, set this parameter to a non-loopback address.
server.host: "0.0.0.0"

# Enables you to specify a path to mount Kibana at if you are running behind a proxy.
# Use the `server.rewriteBasePath` setting to tell Kibana if it should remove the basePath
# from requests it receives, and to prevent a deprecation warning at startup.
# This setting cannot end in a slash.
#server.basePath: ""

# Specifies whether Kibana should rewrite requests that are prefixed with
# `server.basePath` or require that they are rewritten by your reverse proxy.
# This setting was effectively always `false` before Kibana 6.3 and will
# default to `true` starting in Kibana 7.0.
#server.rewriteBasePath: false

# The maximum payload size in bytes for incoming server requests.
#server.maxPayloadBytes: 1048576

# The Kibana server's name. This is used for display purposes.
#server.name: "your-hostname"

# The URLs of the Elasticsearch instances to use for all your queries.
elasticsearch.hosts: ["http://192.168.126.141:9200"]

# When this setting's value is true Kibana uses the hostname specified in the server.host
# setting. When the value of this setting is false, Kibana uses the hostname of the host
# that connects to this Kibana instance.
#elasticsearch.preserveHost: true
....
# Specifies locale to be used for all localizable strings, dates and number formats.
#i18n.locale: "en"

[root@elasticsearch kibana]# systemctl restart kibana
Kibana Service Start Check & Web Browser
[root@elasticsearch kibana]# netstat -anpt | grep 5601
tcp        0      0 0.0.0.0:5601       0.0.0.0:*       LISTEN      1966/node
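Before opening the browser, Kibana's status can also be checked from the command line; /api/status is a standard Kibana endpoint that returns JSON including the overall state:

[root@elasticsearch kibana]# curl 'http://192.168.126.141:5601/api/status'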
Open http://192.168.126.141:5601 in a web browser
Confirm that Kibana is connected to ElasticSearch
Logstash Server Setup
Java JDK Install
[root@logstash ~]# yum list java-*jdk
Loaded plugins: fastestmirror
Loading mirror speeds from cached hostfile
 * base: centos.mirror.cdnetworks.com
 * extras: mirror.kakao.com
 * updates: mirror.kakao.com
Available Packages
java-1.6.0-openjdk.x86_64    1:1.6.0.41-1.13.13.1.el7_3    base
java-1.7.0-openjdk.x86_64    1:1.7.0.201-2.6.16.1.el7_6    updates
java-1.8.0-openjdk.i686      1:1.8.0.191.b12-1.el7_6       updates
java-1.8.0-openjdk.x86_64    1:1.8.0.191.b12-1.el7_6       updates
java-11-openjdk.i686         1:11.0.1.13-3.el7_6           updates
java-11-openjdk.x86_64       1:11.0.1.13-3.el7_6           updates
[root@logstash ~]# yum install -y java-1.8.0-openjdk.x86_64
Loaded plugins: fastestmirror
Loading mirror speeds from cached hostfile
 * base: centos.mirror.cdnetworks.com
 * extras: mirror.kakao.com
 * updates: mirror.kakao.com
Resolving Dependencies
--> Running transaction check
---> Package java-1.8.0-openjdk.x86_64 1:1.8.0.191.b12-1.el7_6 will be installed
...
  nss-softokn-freebl.x86_64 0:3.36.0-5.el7_5    nss-sysinit.x86_64 0:3.36.0-7.el7_5
  nss-tools.x86_64 0:3.36.0-7.el7_5             nss-util.x86_64 0:3.36.0-1.el7_5
Complete!
Logstash Install
[root@logstash ~]# rpm --import https://artifacts.elastic.co/GPG-KEY-elasticsearch
[root@logstash ~]# vi /etc/yum.repos.d/logstash.repo

# logstash.repo
[logstash-6.x]
name=Elastic repository for 6.x packages
baseurl=https://artifacts.elastic.co/packages/6.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md

[root@logstash ~]# yum install -y logstash
Loaded plugins: fastestmirror
Loading mirror speeds from cached hostfile
 * base: mirror.kakao.com
 * extras: mirror.kakao.com
 * updates: mirror.kakao.com
logstash-6.x                | 1.3 kB  00:00:00
logstash-6.x/primary        | 150 kB  00:00:03
...
Installed:
  logstash.noarch 1:6.6.0-1
Complete!
Logstash Configuration & Start
[root@logstash ~]# cd /etc/logstash/
[root@logstash logstash]# ls
conf.d        jvm.options    log4j2.properties  logstash-sample.conf
logstash.yml  pipelines.yml  startup.options
[root@logstash logstash]# vi pipelines.yml

# This file is where you define your pipelines. You can define multiple.
# For more information on multiple pipelines, see the documentation:
#   https://www.elastic.co/guide/en/logstash/current/multiple-pipelines.html

#- pipeline.id: main
#  path.config: "/etc/logstash/conf.d/*.conf"
- pipeline.id: window
  path.config: "/etc/logstash/windows_conf.d/*.conf"
- pipeline.id: linux
  path.config: "/etc/logstash/linux_conf.d/*.conf"

[root@logstash logstash]# mkdir windows_conf.d
[root@logstash logstash]# vi windows_conf.d/logstash.conf

# windows_conf.d/logstash.conf
...
input {
  beats {
    port => 5044
  }
}
output {
  stdout { codec => json }
  elasticsearch {
    hosts => ["http://192.168.126.141:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}

[root@logstash logstash]# mkdir linux_conf.d
[root@logstash logstash]# vi linux_conf.d/logstash.conf

# linux_conf.d/logstash.conf
...
input {
  beats {
    port => 50441
  }
}
output {
  stdout { codec => json }
  elasticsearch {
    hosts => ["http://192.168.126.141:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    #user => "elastic"
    #password => "changeme"
  }
}

[root@logstash logstash]# ls
conf.d       linux_conf.d       logstash-sample.conf  pipelines.yml    windows_conf.d
jvm.options  log4j2.properties  logstash.yml          startup.options
[root@logstash logstash]# cd /usr/share/logstash/bin
[root@logstash bin]# ls
benchmark.sh         ingest-convert.sh  logstash-keystore      logstash-plugin      pqrepair   system-install
cpdump               logstash           logstash-keystore.bat  logstash-plugin.bat  ruby
dependencies-report  logstash.bat       logstash.lib.sh        pqcheck              setup.bat
[root@logstash bin]# ./system-install
Successfully created system startup script for Logstash
[root@logstash bin]# systemctl restart logstash
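If Logstash fails to come up, the pipeline files can be validated without starting the service; --config.test_and_exit is a standard Logstash option that parses a configuration and exits. A sketch against the two files created above:

[root@logstash bin]# ./logstash --config.test_and_exit -f /etc/logstash/windows_conf.d/logstash.conf
[root@logstash bin]# ./logstash --config.test_and_exit -f /etc/logstash/linux_conf.d/logstash.conf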
Logstash Service Start Check
[root@logstash logstash]# netstat -anpt | grep 5044
tcp6       0      0 :::50441       :::*       LISTEN      10171/java
tcp6       0      0 :::5044        :::*       LISTEN      10171/java
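The Beats clients connect to Logstash over TCP 5044 and 50441, so if firewalld is enabled on the Logstash server (not covered in the steps above), those ports need to be opened first; a sketch:

[root@logstash ~]# firewall-cmd --permanent --add-port=5044/tcp
[root@logstash ~]# firewall-cmd --permanent --add-port=50441/tcp
[root@logstash ~]# firewall-cmd --reload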
Beats Server (Log Push Client) Setup
OS : CentOS 7
Filebeat Install
[root@filebeat ~]# curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-6.6.0-x86_64.rpm
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 11.2M  100 11.2M    0     0   548k      0  0:00:21  0:00:21 --:--:--  929k
[root@filebeat ~]# sudo rpm -vi filebeat-6.6.0-x86_64.rpm
warning: filebeat-6.6.0-x86_64.rpm: Header V4 RSA/SHA512 Signature, key ID d88e42b4: NOKEY
Preparing packages...
filebeat-6.6.0-1.x86_64
Filebeat Configuration & Start
[root@filebeat filebeat]# vi filebeat.yml

# filebeat.yml
....
#=========================== Filebeat inputs =============================

filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

- type: log

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/*.log
    - /var/log/httpd/*_log
    #- c:\programdata\elasticsearch\logs\*

  # Exclude lines. A list of regular expressions to match. It drops the lines that are
...
#================================ Outputs =====================================

# Configure what output to use when sending the data collected by the beat.

#-------------------------- Elasticsearch output ------------------------------
#output.elasticsearch:
  # Array of hosts to connect to.
  # hosts: ["localhost:9200"]

  # Enabled ilm (beta) to use index lifecycle management instead daily indices.
  #ilm.enabled: false

  # Optional protocol and basic auth credentials.
  #protocol: "https"
  #username: "elastic"
  #password: "changeme"

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["192.168.126.139:50441"]

  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]

  # Certificate for SSL client authentication
  #ssl.certificate: "/etc/pki/client/cert.pem"

  # Client Certificate Key
  #ssl.key: "/etc/pki/client/cert.key"

#================================ Processors =====================================
...

[root@filebeat filebeat]# systemctl restart filebeat
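Filebeat also ships with self-test subcommands that are handy if the restart does not produce a connection: test config validates filebeat.yml, and test output tries to reach the configured Logstash host:

[root@filebeat filebeat]# filebeat test config
[root@filebeat filebeat]# filebeat test output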
Filebeat Service Start Check
[root@filebeat filebeat]# netstat -anpt | grep 50441
tcp        0      0 192.168.126.137:53140      192.168.126.139:50441      ESTABLISHED 10035/filebeat
OS : Windows Server 2012 R2
Winlogbeat Install & Configuration & Start
PS C:\> $WebClient = New-Object System.Net.WebClient
PS C:\> $WebClient.DownloadFile("https://artifacts.elastic.co/downloads/beats/winlogbeat/winlogbeat-6.6.0-windows-x86_64.zip","C:\opt\winlogbeat.zip")

# After extracting Winlogbeat, edit winlogbeat.yml
...
#======================= Winlogbeat specific options ==========================

# event_logs specifies a list of event logs to monitor as well as any
# accompanying options. The YAML data type of event_logs is a list of
# dictionaries.
#
# The supported keys are name (required), tags, fields, fields_under_root,
# forwarded, ignore_older, level, event_id, provider, and include_xml. Please
# visit the documentation for the complete details of each option.
# https://go.es.io/WinlogbeatConfig

winlogbeat.event_logs:
  - name: Application
    ignore_older: 72h

  - name: Security

  - name: System

  - name: Microsoft-Windows-Sysmon/Operational
...
#================================ Outputs =====================================

# Configure what output to use when sending the data collected by the beat.

#-------------------------- Elasticsearch output ------------------------------
#output.elasticsearch:
  # Array of hosts to connect to.
  #hosts: ["localhost:9200"]

  # Enabled ilm (beta) to use index lifecycle management instead daily indices.
  #ilm.enabled: false

  # Optional protocol and basic auth credentials.
  #protocol: "https"
  #username: "elastic"
  #password: "changeme"

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["192.168.126.139:5044"]

  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]

  # Certificate for SSL client authentication
  #ssl.certificate: "/etc/pki/client/cert.pem"

  # Client Certificate Key
  #ssl.key: "/etc/pki/client/cert.key"
...
# Save and exit

# Install the Winlogbeat service
PS C:\opt\winlogbeat\winlogbeat-6.6.0-windows-x86_64> .\install-service-winlogbeat.ps1

__GENUS          : 2
__CLASS          : __PARAMETERS
__SUPERCLASS     :
__DYNASTY        : __PARAMETERS
__RELPATH        :
__PROPERTY_COUNT : 1
__DERIVATION     : {}
__SERVER         :
__NAMESPACE      :
__PATH           :
ReturnValue      : 5
PSComputerName   :

__GENUS          : 2
__CLASS          : __PARAMETERS
__SUPERCLASS     :
__DYNASTY        : __PARAMETERS
__RELPATH        :
__PROPERTY_COUNT : 1
__DERIVATION     : {}
__SERVER         :
__NAMESPACE      :
__PATH           :
ReturnValue      : 0
PSComputerName   :

Status      : Stopped
Name        : winlogbeat
DisplayName : winlogbeat

# Start the Winlogbeat service
PS C:\opt\winlogbeat\winlogbeat-6.6.0-windows-x86_64> Start-Service winlogbeat
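If the service fails to start or no events show up, Winlogbeat has the same built-in self-test subcommands as the other Beats, so the configuration and the Logstash connection can be checked from PowerShell:

PS C:\opt\winlogbeat\winlogbeat-6.6.0-windows-x86_64> .\winlogbeat.exe test config -c .\winlogbeat.yml -e
PS C:\opt\winlogbeat\winlogbeat-6.6.0-windows-x86_64> .\winlogbeat.exe test output -c .\winlogbeat.yml -e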
Winlogbeat Service Start Check
PS C:\> netstat -an | findstr 5044
  TCP    192.168.126.138:49270    192.168.126.139:5044    ESTABLISHED
Verify ElasticSearch & Kibana
ElasticSearch Input Data Check
http://192.168.126.141:9200/_cat/indices
- Confirm that data is accumulating in ElasticSearch
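The same check can be done from the command line; _cat/indices is a standard ElasticSearch API and should list the filebeat-* and winlogbeat-* indices created by the pipelines above:

[root@elasticsearch ~]# curl 'http://192.168.126.141:9200/_cat/indices?v'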
In Kibana, click Discover, enter the name of the index data to visualize, then click Next Step
- filebeat-*
Select the field name to filter on, then click the Create Index Pattern button
Visualize the Filebeat log data and confirm it is arriving