MySQL Binlog into Elasticsearch

Tags: zookeeper, kibana, logstash

Environment Preparation:

On 10.99.35.214, install elasticsearch-5.4.1.tar.gz, jdk-8u121-linux-x64.tar.gz, kibana-5.1.1-linux-x86_64.tar.gz, and nginx-1.12.2.tar.gz

On 10.99.35.215 and 10.99.35.216, install elasticsearch-5.4.1.tar.gz and jdk-8u121-linux-x64.tar.gz

On 10.99.35.209, install mysql-5.7.17-linux-glibc2.5-x86_64.tar.gz, jdk-8u121-linux-x64.tar.gz, logstash-5.5.1.tar.gz, maxwell-1.10.7.tar.gz, and kafka_2.11-0.11.0.1


MySQL is installed on 209. Maxwell reads the binlog and sends the change events to Kafka; Logstash consumes the data from Kafka and forwards it to the Elasticsearch cluster formed by 214, 215, and 216; the data is then displayed through Kibana and Nginx on 214.


1. Enable the binlog

vim /etc/my.cnf

server-id=1

log-bin=master (this setting enables the binlog)

binlog_format=row
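
After restarting MySQL, it is worth confirming that the binlog is actually on. A minimal check, assuming the root credentials are at hand:

mysql -uroot -p -e "SHOW VARIABLES LIKE 'log_bin'; SHOW MASTER STATUS;"

log_bin should report ON, and SHOW MASTER STATUS should list a binlog file such as master.000001.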


2. MySQL authorization

GRANT ALL ON maxwell.* TO 'maxwell'@'%' IDENTIFIED BY '[email protected]';

GRANT SELECT, REPLICATION CLIENT, REPLICATION SLAVE ON *.* TO 'maxwell'@'%';

FLUSH PRIVILEGES;
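
To confirm the account is set up as intended, a quick check:

mysql -uroot -p -e "SHOW GRANTS FOR 'maxwell'@'%';"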


3. Configure Maxwell

wget https://github.com/zendesk/maxwell/releases/download/v1.10.7/maxwell-1.10.7.tar.gz

tar xvfz maxwell-1.10.7.tar.gz
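
Maxwell can also pick up its settings from a config.properties file in its install directory instead of passing everything on the command line. A minimal sketch, mirroring the flags used in step 4 (the kafka_topic value shown is Maxwell's default):

# config.properties (optional alternative to the command-line flags in step 4)
user=maxwell
password=[email protected]
host=10.99.35.209
producer=kafka
kafka.bootstrap.servers=10.99.35.209:9092
kafka_topic=maxwell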


4. Start Maxwell from the command line

nohup bin/maxwell --user='maxwell' --password='[email protected]' --host='10.99.35.209' --producer=kafka --kafka.bootstrap.servers=10.99.35.209:9092 &

Explanation: the --host parameter points to the server where MySQL is installed, and kafka.bootstrap.servers lists the host and port of the nodes in the Kafka cluster.
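
Maxwell publishes each row change to Kafka as a JSON document. Roughly what one message looks like (the database, table, and column values here are only illustrative; the real ones depend on your schema):

{"database": "megacorp", "table": "employee", "type": "insert", "ts": 1505371234, "xid": 8931, "commit": true, "data": {"id": 1, "first_name": "John", "last_name": "Doe", "age": 32, "about": "likes rock climbing", "interests": ["sports", "music"]}}

The database, table, ts, xid, commit, and data fields are exactly what the Logstash filter in step 6 removes or renames.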


5. Kafka configuration

Note: my Kafka is installed on 10.99.35.209, the same host that runs MySQL. Make sure the port number in the Kafka configuration file matches the port given on the Maxwell command line.

wget http://mirror.bit.edu.cn/apache/kafka/0.10.2.1/kafka_2.11-0.10.2.1.tgz

mv kafka_2.11-0.10.2.1.tgz /usr/local/

cd /usr/local/

tar xvfz kafka_2.11-0.10.2.1.tgz

Kafka depends on ZooKeeper, so a ZooKeeper server must be started first if one is not already running. The convenience script packaged with Kafka gives you a quick, single-node ZooKeeper instance:

nohup bin/zookeeper-server-start.sh config/zookeeper.properties &
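
To check that ZooKeeper came up, one option is its four-letter-word command (this assumes nc/netcat is installed on the host):

echo ruok | nc 10.99.35.209 2181

A healthy server replies with imok.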


Kafka ships with a basic configuration file in the config directory. To make Kafka reachable remotely, two settings need to be changed. Open config/server.properties; near the top there are commented-out listeners and advertised.listeners entries. Remove the comments and adjust them to the current server IP as follows:

listeners=PLAINTEXT://:9092

advertised.listeners=PLAINTEXT://10.99.35.209:9092

Here the current server IP is 10.99.35.209; change this to an IP that is reachable from your LAN or from the Internet.

Next start the Kafka service:

nohup bin/kafka-server-start.sh config/server.properties &


Create a topic called maxwell to receive the data:

bin/kafka-topics.sh --create --zookeeper 10.99.35.209:2181 --replication-factor 1 --partitions 1 --topic maxwell


Once created, you can list the existing topics with the following command:

bin/kafka-topics.sh --list --zookeeper 10.99.35.209:2181


View the details of the topic:

bin/kafka-topics.sh --describe --zookeeper 10.99.35.209:2181 --topic maxwell


Start the producer window

bin/kafka-console-producer.sh --broker-list 10.99.35.209:9092 --topic maxwell


Start the Consumer window

bin/kafka-console-consumer.sh --zookeeper 10.99.35.209:2181 --topic maxwell --from-beginning
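
At this point the MySQL, Maxwell, and Kafka legs can be tested end to end. For example (the database and table names below are only examples; use a table that actually exists on 209):

mysql -uroot -p -e "INSERT INTO megacorp.employee (first_name, last_name, age) VALUES ('John', 'Doe', 32);"

The consumer window should immediately print the corresponding JSON change event produced by Maxwell.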


6. Install and configure Logstash

Modify the Logstash settings file (config/logstash.yml):

log.level: info

path.logs: /usr/local/logstash-5.1.1/logs


Example run:

Start Zookeeper

bin/zookeeper-server-start.sh -daemon config/zookeeper.properties


Start Kafka

bin/kafka-server-start.sh config/server.properties


Start Maxwell

bin/maxwell --user='maxwell' --password='[email protected]' --host='10.99.35.209' --producer=kafka --kafka.bootstrap.servers=10.99.35.209:9092


First install the translate filter plugin:

bin/logstash-plugin install logstash-filter-translate


The Logstash pipeline configuration is as follows:


input {
  # Consume Maxwell's JSON change events from the maxwell topic
  kafka {
    bootstrap_servers => "10.99.35.209:9092"
    topics => ["maxwell"]
    codec => json { charset => "ISO-8859-1" }
    consumer_threads => 5
    decorate_events => true
  }
}

filter {
  # Drop Maxwell's metadata fields and lift the row columns out of [data]
  mutate {
    remove_field => ["database", "table", "ts", "xid", "commit", "old", "kafka"]
    rename => {
      "[data][id]"         => "id"
      "[data][first_name]" => "first_name"
      "[data][last_name]"  => "last_name"
      "[data][age]"        => "age"
      "[data][about]"      => "about"
      "[data][interests]"  => "interests"
    }
  }
  # Map Maxwell's change type to the matching Elasticsearch action
  translate {
    field => "type"
    destination => "op_type"
    dictionary => {
      "insert" => "index"
      "update" => "update"
      "delete" => "delete"
    }
  }
}

output {
  # Index into the Elasticsearch cluster, using the row id as the document id
  elasticsearch {
    hosts => ["10.99.35.214:9200"]
    index => "megacorp"
    document_id => "%{id}"
    document_type => "employee"
    action => "%{op_type}"
    workers => 1
    flush_size => 20000
    idle_flush_time => 10
    template_overwrite => true
  }
  stdout {}
}
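
Save the pipeline above to a file and start Logstash with it (the file name here is just an example):

nohup bin/logstash -f config/maxwell-es.conf &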

With Logstash and Elasticsearch running, any data added, updated, or deleted in the MySQL database shows up in the Elasticsearch index almost in real time.
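
A simple way to verify the result, using the index name from the pipeline above:

curl 'http://10.99.35.214:9200/megacorp/_search?pretty'

The Kibana instance on 10.99.35.214 can then be pointed at the megacorp index pattern to browse the data.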


#end

