User:Tsu2/Install and Intro Logstash-Elasticsearch-Kibana
Logstash - Elasticsearch - Kibana
This page is for anyone interested in Big Data analysis, primarily those looking to implement some type of MapReduce. It describes a solution that can immediately aggregate logfiles of all types across an Enterprise. Grok automatically parses the incoming logfiles according to standard known formats, and Elasticsearch builds indexes for quickly searching the data. Kibana is one of the most capable web interfaces, enabling search queries to be built quickly and their results displayed in a simple but powerful graphical format.
This page is not intended to be comprehensive, far from it.
The main objective is to enable the User to set up a working system on a single openSUSE node (real or virtual), then run the quick tests described on the original logstash and kibana websites. The User will see immediate results.
The original tutorial/tests have been modified:
All required files are downloaded before any testing instead of piecemeal per test.
All config and data files for every test reside in the same working directory.
Logstash is installed into /opt/logstash. If the User prefers, the original commands from the logstash website can be used instead; they assume the logstash jar file is also in the working directory.
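Because the jar stays in /opt/logstash rather than the working directory, a small wrapper script saves retyping the full path in every test. A sketch, assuming the jar name and install path used in this guide:

```shell
# Sketch: wrapper so the tutorial commands work unchanged from any directory.
# Assumes logstash-1.1.13-flatjar.jar sits in /opt/logstash as described above.
cat > logstash-run.sh <<'EOF'
#!/bin/sh
exec java -jar /opt/logstash/logstash-1.1.13-flatjar.jar "$@"
EOF
chmod +x logstash-run.sh
```

Then, for example, ./logstash-run.sh agent -f hello.conf runs the first test below.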
Based on the "10 minute tutorial" at http://logstash.net/docs/1.1.13/tutorials/10-minute-walkthrough/
Follow my guides for installing
After logstash, elasticsearch and kibana are installed:
logstash installed in /opt/logstash
elasticsearch installed and running as a service
kibana installed in ~/Kibana
Tests will be run from a directory of your choice, which means that both config files and data are in the same directory
Download config and Data files from logstash
If you don't intend to run your tests from the Downloads directory (which is fine), move them to your directory of choice.
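The move can be scripted; a sketch, assuming the files were saved to ~/Downloads and using the filenames that appear in the tests below (adjust WORKDIR to taste):

```shell
# Sketch: collect the downloaded tutorial files into one working directory.
# WORKDIR is an arbitrary choice; the filenames are the ones used in the tests.
WORKDIR="$HOME/logstash-tests"
mkdir -p "$WORKDIR"
for f in hello.conf hello-search.conf apache-parse.conf \
         apache-elasticsearch.conf apache_log.1 apache_log.2.bz2; do
  # move each file only if it was actually downloaded
  [ -f "$HOME/Downloads/$f" ] && mv "$HOME/Downloads/$f" "$WORKDIR/" || true
done
```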
Optional, But Recommended
- If you're running this in a Guest VM using virtualization, you may want to set up a shared file system so that you can create and transfer files between your Host and Guest. If you're using KVM or Xen, I recommend following my libvirt VM Manager setup guide
- If you think you may be executing the test commands often, you may want to convert them to executable scripts, e.g.:
echo "java -jar logstash-1.1.13-flatjar.jar agent -f hello.conf" > Run_HelloWorld_Test.sh && chmod +x Run_HelloWorld_Test.sh
Test a Simple "Hello World"
Objective: View how data is reformatted to the logstash format
Open 2 consoles.
java -jar logstash-1.1.13-flatjar.jar agent -f hello.conf
Type something random and observe how it is re-emitted in the logstash event format
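For reference, hello.conf from the walkthrough download is essentially a stdin input wired to a stdout output. A sketch of its likely shape (check the downloaded file for the exact contents; option names follow the logstash 1.1.x config syntax):

```
# hello.conf (sketch) - read lines from stdin, print them as logstash events
input {
  stdin {
    type => "stdin-type"
  }
}
output {
  stdout {
    debug => true
  }
}
```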
Test logstash passed to Elasticsearch
(Frankly I couldn't get it to work)
Open 2 consoles
java -jar logstash-1.1.13-flatjar.jar agent -f hello-search.conf
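hello-search.conf swaps the stdout output for an elasticsearch one, so typed lines are indexed instead of just echoed. A sketch of its likely shape, assuming the logstash 1.1.x elasticsearch output (verify against the downloaded file; if the test fails, first confirm elasticsearch is answering on localhost:9200):

```
# hello-search.conf (sketch) - index stdin lines into the local elasticsearch
input {
  stdin {
    type => "stdin-type"
  }
}
output {
  elasticsearch {
    host => "127.0.0.1"
  }
}
```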
The logstash web interface
Objective: Use the built in minimal Logstash web interface for submitting queries
Open 2 consoles plus a web browser
java -jar logstash-1.1.13-flatjar.jar agent -f hello-search.conf -- web --backend 'elasticsearch://localhost/'
Console 2: Type some random text.
Web Browser: Open localhost:9292 and search for the text you typed in.
Test - Searching real log data (single entry logfile first)
Objective: This first data file test only inspects a single line entry
Open 2 consoles
java -jar logstash-1.1.13-flatjar.jar agent -f apache-parse.conf
Input the data manually
nc localhost 3333 < apache_log.1
View in a web browser localhost:9292
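apache-parse.conf listens on TCP port 3333 (which is why the data is fed with nc) and uses a grok filter to split each apache log line into fields. A sketch of its likely shape under the logstash 1.1.x syntax; the downloaded file is authoritative:

```
# apache-parse.conf (sketch) - receive apache log lines on TCP 3333,
# parse them with the stock COMBINEDAPACHELOG grok pattern
input {
  tcp {
    port => 3333
    type => "apache"
  }
}
filter {
  grok {
    type => "apache"
    pattern => "%{COMBINEDAPACHELOG}"
  }
}
output {
  stdout {
    debug => true
  }
}
```

Since this test also views results at localhost:9292, the downloaded file likely wires an elasticsearch output as well.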
Searching a larger data file
Objective: This is closer to the "Real World": inspect a larger apache logfile spanning multiple days, then explore querying and drilling into the results
Open 3 consoles
java -jar logstash-1.1.13-flatjar.jar agent -f apache-elasticsearch.conf -- web --backend 'elasticsearch://localhost/'
bzip2 -d apache_log.2.bz2
nc localhost 3333 < apache_log.2
Open a web browser localhost:9292
With the above still running, open a console, then open a web browser to localhost:5601 (Kibana)