What is Splunk?
Splunk is a powerful, cloud-oriented log search engine and log management platform. You can add local or remote logs in multiple ways and generate graphical reports. Its most powerful feature is search, which is why it is sometimes called the "Google of logs".
Features:
1. Supports installation on multiple platforms
2. Indexes any data from any source
3. Receives data forwarded from remote systems
4. Correlates complex events into visual views
5. Dedicated big-data engine supporting big-data retrieval
6. Scales out across the data center
7. Role-based security controls
Common functional views:
1. Splunk indexes any data
2. Search and investigation
3. Event correlation
4. Monitoring and alerting
5. Report generation
6. Custom dashboards
Web interface default port: 8000
Installation and deployment
/opt -- it is recommended to put /opt on its own partition, or to make the / partition large enough.
Considerations: time synchronization
Memory: preferably 1 GB or more
I. Installation and Configuration
Software Download:
http://zh-hans.splunk.com/download
1. Installation
# rpm -ivh splunk-5.0.2-149561-linux-2.6-x86_64.rpm
-------------------------------------------------------------------------
Splunk has been installed in:
    /opt/splunk
To start Splunk, run the command:
    /opt/splunk/bin/splunk start
To use the Splunk Web interface, point your browser at:
    http://localhost.localdomain:8000
Complete documentation is at http://docs.splunk.com/Documentation/Splunk
-------------------------------------------------------------------------
2. Start the service
# /opt/splunk/bin/splunk start
// The first start displays the license agreement: press q to leave the pager, then type y to accept.
3. Set auto-start upon startup
# /opt/splunk/bin/splunk enable boot-start
Init script installed at /etc/init.d/splunk.
Init script is configured to run at boot.
4. View help
# /opt/splunk/bin/splunk help enable
You can use /etc/init.d/splunk to manage starting and stopping Splunk.
# /etc/init.d/splunk status
5. Access the web page
# firefox http://172.16.254.239:8000 &
On first access you are prompted for the default username and password: admin/changeme
After logging in, set a new password
6. Modify the default web port in the configuration file
# vim /opt/splunk/etc/system/default/web.conf
httpport = 8000
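Files under etc/system/default are overwritten on upgrade, so the usual Splunk convention is to override settings in etc/system/local instead. A minimal sketch (the port 8080 below is just an arbitrary example):

```
# /opt/splunk/etc/system/local/web.conf
[settings]
httpport = 8080
```

Restart Splunk afterwards (# /opt/splunk/bin/splunk restart) for the new port to take effect.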
Splunk log import
The data source can be local or remote: Linux, UNIX, or Windows hosts, switches and routers, or a web server, an IIS server, or an FTP server.
1. Import local log messages to splunk
Add data --> syslog --> consume any syslog file or directory on the Splunk server --> path to the file on the server: /var/log/messages --> keep the defaults --> continue --> save
2. Search for desired logs
Start searching --> enter the keyword aborting
Search by host: host="localhost"
Exercise: import the secure log (/var/log/secure) into Splunk and verify that it can be searched
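The exercise can also be done from the command line instead of the web UI. A hedged sketch, assuming the default install path (splunk add monitor is the standard CLI way to add a file input; the linux_secure sourcetype name is an assumption):

```
# /opt/splunk/bin/splunk add monitor /var/log/secure -sourcetype linux_secure
```

Afterwards, search sourcetype=linux_secure in the web interface to confirm events are arriving.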
3. Import remote logs to splunk
Environment:
Splunk server: 172.16.254.239
Remote Apache: 172.16.254.200 web.up.com
Start Configuration:
Note: To import remote logs, you must install splunkforwarder on the remote host.
1. Set the splunk server to allow receiving data sent by the splunk Forwarder
Manager --> forwarding and receiving --> under "receive data", click "Add" --> 9999 (the port on which to receive forwarded data) --> Save
2. Configure on the remote server
1) install the splunkforwarder package
# rpm -ivh splunkforwarder-5.0.2-149561-linux-2.6-x86_64.rpm
2) Configure splunkforwarder to forward Apache logs
# /opt/splunkforwarder/bin/splunk start
# /opt/splunkforwarder/bin/splunk add forward-server 172.16.254.239:9999
Splunk username: admin
Password: changeme
Added forwarding to: 172.16.254.239:9999.
# cd /opt/splunkforwarder/etc/system/local/
# vim inputs.conf
[default]
host = web.up.com
[monitor:///var/log/httpd]
sourcetype = access_common
3) restart the splunk forwarder service.
# /opt/splunkforwarder/bin/splunk restart
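To confirm that the forwarder is connected, it can list its configured destinations from the CLI (splunk list forward-server is a standard forwarder command; the exact output format may vary by version):

```
# /opt/splunkforwarder/bin/splunk list forward-server
Active forwards:
    172.16.254.239:9999
```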
3. Verification results
If the Apache logs from web.up.com appear in the Splunk server's web interface, the setup is working.
Simple search:
host="web.up.com"
host="web.up.com" error
host="web.up.com" OR error      // boolean operators such as OR must be uppercase
Generate View
Import Test Data
Place sampledata.zip on the Splunk server
Continue in the browser:
Manager --> data import --> files and directories "Add" --> skip preview --> upload and index a file --> Save
I. Using Splunk's powerful search functions
sourcetype="access_combined_wcookie"
1. Search requests whose sourcetype is access_combined_wcookie and whose IP address is 10.2.1.44 (81 results)
sourcetype="access_combined_wcookie" 10.2.1.44
2. Among those, search the 63 requests containing "purchase"
sourcetype="access_combined_wcookie" 10.2.1.44 purchase
3. Further restrict to purchase requests whose HTTP return code is not 200
sourcetype="access_combined_wcookie" 10.2.1.44 purchase NOT 200
4. Further exclude requests with a 404 error
sourcetype="access_combined_wcookie" 10.2.1.44 purchase NOT 200 NOT 404
Searching by field
Compared with plain keyword search, you can also search on extracted fields:
sourcetype="access_combined_wcookie" 10.2.1.44 action=purchase category_id=flowers
| : the pipe passes the previous search result as input to the next command.
1. Search for the best-selling products using the top command
sourcetype="access_combined_wcookie" action=purchase | top category_id
2. How many distinct customers bought flowers?
Use the stats command with the dc (distinct count) function:
sourcetype="access_combined_wcookie" action=purchase category_id=flowers | stats dc(clientip)
3. How many flowers did each customer buy?
sourcetype="access_combined_wcookie" action=purchase category_id=flowers | stats count by clientip
Use the stats command with the count function and a by clause.
4. The same count per customer, in descending order:
sourcetype="access_combined_wcookie" action=purchase category_id=flowers | stats count by clientip | sort -count
sort -count: sort by the count field in descending order
sort count: sort in ascending order
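The stats count by clientip | sort -count pipeline behaves much like counting and sorting in an ordinary shell pipeline. A rough analogy using awk on a tiny hypothetical Apache-style log (the sample data below is invented for illustration; field 1 of the combined log format is the client IP):

```shell
# Build a tiny hypothetical Apache-style log
cat > /tmp/sample_access.log <<'EOF'
10.2.1.44 - - [18/Jul/2014:10:00:01 +0800] "GET /purchase HTTP/1.1" 200 512
10.2.1.44 - - [18/Jul/2014:10:00:02 +0800] "GET /purchase HTTP/1.1" 200 512
10.2.1.44 - - [18/Jul/2014:10:00:03 +0800] "GET /purchase HTTP/1.1" 503 512
192.168.1.5 - - [18/Jul/2014:10:00:04 +0800] "GET /purchase HTTP/1.1" 404 256
EOF

# Roughly equivalent to: ... | stats count by clientip | sort -count
awk '{count[$1]++} END {for (ip in count) print count[ip], ip}' /tmp/sample_access.log | sort -rn
# -> 3 10.2.1.44
#    1 192.168.1.5
```

The difference is that Splunk extracts the clientip field for you and runs this aggregation at search time across all indexed events.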
5. Give the result columns friendly names
sourcetype="access_combined_wcookie" action=purchase category_id=flowers | stats count by clientip | sort count | rename clientip as "customer", count as "Total number of flowers bought"
II. Save the search results to a dashboard
1. Create an empty dashboard
Dashboards and views --> Create a dashboard --> ID: 001, name: Total number of flowers bought by each customer --> Create
2. Save the search result to the dashboard
sourcetype="access_combined_wcookie" action=purchase category_id=flowers | stats count by clientip | sort -count | rename clientip as "customer", count as "Total number of flowers bought"
Search --> Create --> dashboard --> total spending per customer --> existing dashboard --> select the desired dashboard from the drop-down list --> keep the defaults
3. Dynamically update the generated chart while results are being collected
Creating a report
Search: sourcetype="access_combined_wcookie" action=purchase category_id=flowers | stats count by clientip | sort -count
Create --> Report
This article comes from the "Linux" blog; please keep the source link: http://3927416.blog.51cto.com/3917416/1536550