Syncing MySQL to Elasticsearch
Start with the project requirement: add search functionality to the news/information module.
I implemented this search feature with Elasticsearch. I have just finished it, so I am writing this post to record the whole procedure and the things to watch out for along the way.
First, install Elasticsearch and visualization tools
Reference for this whole part: installing Elasticsearch and the visualization tool on Mac
1. Install Elasticsearch
Download address: official website
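For reference, a minimal sketch of a tarball install on macOS, assuming version 6.3.2 to match the rest of this post; the download URL follows Elastic's usual artifact naming, so verify it against the official download page:

# Download, unpack and start Elasticsearch 6.3.2
curl -O https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-6.3.2.tar.gz
tar -xzf elasticsearch-6.3.2.tar.gz
cd elasticsearch-6.3.2
./bin/elasticsearch
# In another terminal, curl http://localhost:9200 should return the cluster info as JSON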
2. Install Elasticsearch-head (visual interface)
Installation address: https://github.com/mobz/elasticsearch-head
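Roughly the steps from the elasticsearch-head README, written out as a sketch (it needs Node.js and grunt-cli from steps 3 and 4 below):

# Fetch and run the elasticsearch-head UI
git clone https://github.com/mobz/elasticsearch-head.git
cd elasticsearch-head
npm install
grunt server    # serves the UI on http://localhost:9100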
3. Install Node.js
Installation reference: installing Node.js on Mac
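On a Mac with Homebrew available this can be as simple as the sketch below; the installer from nodejs.org works just as well:

# Install Node.js (npm comes with it) and verify
brew install node
node -v
npm -v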
4. Install grunt-cli (steps 3 and 4 mainly exist to support step 2, the visual interface)
Command: sudo npm install -g grunt-cli (I installed on a Mac, so this may not apply to other platforms)
Run in the terminal: grunt --version (a successful install prints the grunt-cli version)
5. Integrate Elasticsearch with elasticsearch-head
Modify the elasticsearch.yml file and add the following at the end:
http.cors.enabled: true
http.cors.allow-origin: "*"
Check the result: open localhost:9100 in the browser.
If the page loads and connects, the whole installation succeeded; a green status means the cluster is healthy.
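As a quick check without the browser, you can also ask Elasticsearch for the cluster health directly; a minimal sketch against the default local address used above:

# "status" : "green" here corresponds to the green bar in elasticsearch-head
curl 'http://localhost:9200/_cluster/health?pretty'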
Second, install Logstash and synchronize the MySQL database
Recommended related blog: install Logstash and synchronize a MySQL database
1. Download Logstash
Note: the Logstash version you download should match your Elasticsearch version number; mine is Elasticsearch 6.3.2.
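A minimal sketch of downloading and unpacking a matching Logstash; the URL follows Elastic's usual artifact naming for 6.3.2, so double-check it against the official download page:

# Download and unpack Logstash 6.3.2 to match Elasticsearch 6.3.2
curl -O https://artifacts.elastic.co/downloads/logstash/logstash-6.3.2.tar.gz
tar -xzf logstash-6.3.2.tar.gz
cd logstash-6.3.2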
2. Configure logstash-input-jdbc
It is said that versions above 2.x no longer need this plugin installed separately, but I set it up anyway.
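If the plugin does need to be installed manually, Logstash ships with a plugin manager; a minimal sketch, run from the Logstash home directory:

# Install (or confirm) the JDBC input plugin
bin/logstash-plugin install logstash-input-jdbc
# List installed plugins to verify
bin/logstash-plugin list | grep jdbc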
3. Add the MySQL Connector driver jar package
Put this jar package into Logstash: mysql-connector-java-5.1.21.jar (my configuration below expects it in the bin directory).
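A sketch of copying the driver into place; the Logstash path is only an example, adjust it to wherever you unpacked Logstash:

# The jdbc_driver_library setting in the configuration below resolves the jar relative to bin
cp mysql-connector-java-5.1.21.jar /path/to/logstash-6.3.2/bin/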
4. Add the configuration file (it connects Elasticsearch with the MySQL database); this is the important part!
For detailed explanations of each option, see the recommended blog: Logstash JDBC input database connection.
input {
  stdin {}
  jdbc {
    # ES type for this source; the output section below routes documents on it
    type => "news_info"
    # MySQL JDBC connection for the news database
    jdbc_connection_string => "jdbc:mysql://127.0.0.1:3306/news"
    jdbc_user => "root"
    jdbc_password => "root"
    # Track the auto-increment primary key so only new rows are read
    tracking_column => "auto_id"
    record_last_run => "true"
    use_column_value => "true"
    # Stores the id of the last record read; created automatically as "news" in the bin directory.
    # Required here, otherwise an error is raised.
    last_run_metadata_path => "news"
    clean_run => "false"
    # mysql-connector-java-5.1.21.jar is placed in the bin directory
    jdbc_driver_library => "mysql-connector-java-5.1.21.jar"
    # Driver class name for MySQL
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_paging_enabled => "true"
    # page size for paged queries (the value here is only an example)
    jdbc_page_size => "50000"
    statement => "select auto_id,title,content,up_count,down_count,origin_create_time,grade from t_live_news_origin where auto_id > :sql_last_value and similar_score > 0.5"
    # Cron-style schedule, fields left to right: minute, hour, day of month, month, day of week; all * means run every minute
    schedule => "* * * * *"
  }
  jdbc {
    # Second source: the press table in the same news database
    type => "press_info"
    jdbc_connection_string => "jdbc:mysql://127.0.0.1:3306/news"
    jdbc_user => "root"
    jdbc_password => "root"
    tracking_column => "auto_id"
    record_last_run => "true"
    use_column_value => "true"
    last_run_metadata_path => "news"
    clean_run => "false"
    jdbc_driver_library => "mysql-connector-java-5.1.21.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_paging_enabled => "true"
    jdbc_page_size => "50000"
    statement => "select auto_id,title,source_mc,read_count,summary,summary_img,origin_create_time from t_live_press_origin where auto_id > :sql_last_value"
    schedule => "* * * * *"
  }
}
filter {
  mutate {
    convert => ["publish_time", "string"]
  }
  date {
    timezone => "Europe/Berlin"
    match => ["publish_time", "ISO8601", "yyyy-MM-dd HH:mm:ss"]
  }
  # date {
  #   match => ["publish_time", "yyyy-MM-dd HH:mm:ss,SSS"]
  #   remove_field => ["publish_time"]
  # }
  json {
    source => "message"
    remove_field => ["message"]
  }
}
output {
  if [type] == "news_info" {
    elasticsearch {
      # ES address and port
      hosts => "127.0.0.1:9200"
      # ES index name (self-defined)
      index => "wantu_news_info"
      # Use the auto-increment id as the document id
      document_id => "%{auto_id}"
    }
  }
  if [type] == "press_info" {
    elasticsearch {
      hosts => "127.0.0.1:9200"
      index => "wantu_press_info"
      document_id => "%{auto_id}"
    }
  }
}
Save the configuration above as mysql.yml.
5. Start Logstash
# My mysql.yml is placed in the directory one level above bin
./logstash -f ../mysql.yml
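If Logstash exits right away with a configuration error, it helps to let it validate the file first; a minimal sketch, again assuming mysql.yml sits one level above bin:

# Parse and check the configuration without starting the pipeline
./logstash -f ../mysql.yml --config.test_and_exit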
6. The result
The connection succeeded: the data in the MySQL database tables was written into Elasticsearch, and Logstash polls the database for the latest data every minute.
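To confirm that documents actually arrived, the indexes can be queried directly; a minimal sketch using the index names defined in the output section above:

# Count the synced documents in each index
curl 'http://localhost:9200/wantu_news_info/_count?pretty'
curl 'http://localhost:9200/wantu_press_info/_count?pretty'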
Finally, here is a look at where my Logstash files are stored.
Third, summary of pitfalls and precautions
1. The following error means the mysql-connector jar package could not be found; most likely the jar is not in the directory specified in the configuration file.
2. If you need to restart the sync from 0:
Delete the news file referenced by last_run_metadata_path => "news", and remember to delete the index as well, so the data in the database tables is read again.
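A minimal sketch of that reset, assuming the news state file was created in the Logstash bin directory and the index names from the configuration above:

# Remove the saved sql_last_value so the next run starts from 0
rm bin/news
# Drop the indexes so all rows are re-indexed
curl -XDELETE 'http://localhost:9200/wantu_news_info'
curl -XDELETE 'http://localhost:9200/wantu_press_info'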
Blog address for other related pitfalls: a summary of the pitfalls
If you run into other related pitfalls, a quick search on Baidu will usually turn up the answer.
Every now and then I calm down and look back over the past. There is no point blaming those naïve, dull old days; after all, the road ahead is still long. Keep encouraging yourself.
Daybreak is a new beginning, and also an unknown journey.
"ElasticSearch"---ElasticSearch sync mysql