Use the pt-fifo-split tool to insert massive amounts of data into MySQL

Source: Internet
Author: User
Tags: mysql, insert

The article "MySQL Insert/Update Data" mentions that the LOAD DATA INFILE statement can load data from a file directly into MySQL. If the file is very large, however, you may need to cut it into pieces and load them one at a time. In that case, you can use the pt-fifo-split tool to split the file into chunks and control how much data is passed to the MySQL server in each load.
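The "cut the file, multiple loads" approach can be illustrated with the standard split(1) tool. This is only a sketch: the file names, the 4-line chunk size, and the my_table target are made up for the demo.

```shell
#!/bin/sh
# Cut a file into fixed-size pieces, then load each piece separately.
rm -f /tmp/part_*
seq 1 10 > /tmp/big.txt              # stand-in for a large data file
split -l 4 /tmp/big.txt /tmp/part_   # 4 lines per piece: part_aa, part_ab, part_ac
# Each piece could then be loaded on its own, e.g.:
#   mysql -e "LOAD DATA INFILE '/tmp/part_aa' INTO TABLE my_table;"
wc -l /tmp/part_*
```

Physically splitting the file doubles the disk usage, which is exactly what pt-fifo-split avoids.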

pt-fifo-split can simulate cutting a file and passing the pieces through a pipe, like a first-in, first-out queue, without actually splitting the file on disk. pt-fifo-split is part of the Percona Toolkit; see its documentation for details.

pt-fifo-split command usage:

pt-fifo-split [options] [FILE ...]

For example, use pt-fifo-split to split a large file, reading 1,000,000 lines at a time:

pt-fifo-split --lines 1000000 hugefile.txt &

while [ -e /tmp/pt-fifo-split ]; do
  cat /tmp/pt-fifo-split
done

By default, pt-fifo-split creates a FIFO file in /tmp and writes data from the large file into it. After printing the specified number of lines to the FIFO, it signals EOF, closes and removes the FIFO, then recreates it and prints more lines. This guarantees that each read returns at most the specified number of lines, until the whole file has been consumed. Note that this tool only works on Unix-like operating systems.
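The open/write/close cycle described above is plain FIFO behavior: when the writer closes the FIFO, the reader sees EOF, and the next open delivers the next chunk. A minimal sketch using mkfifo directly (no pt-fifo-split; file names and the 2-line chunk size are invented for the demo):

```shell
#!/bin/sh
# Demonstrate one-chunk-per-read semantics of a FIFO.
fifo=/tmp/demo.fifo
out=/tmp/demo_chunks.txt
rm -f "$fifo" "$out"
mkfifo "$fifo"

for start in 1 3 5; do
  # Writer: emit a 2-line chunk, then close the FIFO (background job).
  seq "$start" $((start + 1)) > "$fifo" &
  # Reader: cat returns at EOF, i.e. after exactly one chunk.
  chunk=$(cat "$fifo")
  printf 'chunk: [%s]\n' "$(printf '%s' "$chunk" | tr '\n' ' ')" >> "$out"
  wait
done

rm -f "$fifo"
cat "$out"
```

Each `cat` returns exactly one chunk, which is what lets the consumer loop in the earlier example process the file piece by piece.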

Common options:

--fifo /tmp/pt-fifo-split, the path of the FIFO file;

--offset 0, set this parameter if you do not intend to start reading from the first line;

--lines 1000, the number of lines per chunk;

--force, if the FIFO file already exists, delete it first and then recreate it.
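Putting the options together, a minimal invocation might look like the following. This is a hedged sketch: the input file is a tiny stand-in created by the script itself, and the block skips gracefully when Percona Toolkit is not installed.

```shell
#!/bin/sh
# Sketch of a pt-fifo-split run combining --force, --lines and --fifo.
if command -v pt-fifo-split >/dev/null 2>&1; then
  seq 1 20 > /tmp/hugefile.txt    # stand-in input; smaller than one chunk
  rm -f /tmp/out.txt
  # --force recreates the FIFO if one is left over from an earlier run
  pt-fifo-split --force --lines 5000 --fifo /tmp/pt-fifo-split /tmp/hugefile.txt &
  sleep 2                         # give pt-fifo-split time to create the FIFO
  while [ -e /tmp/pt-fifo-split ]; do
    cat /tmp/pt-fifo-split >> /tmp/out.txt
  done
else
  echo "pt-fifo-split not installed (it ships with percona-toolkit)"
fi
touch /tmp/ptdemo.done            # marker: the script ran to completion
```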

Here is a complete example of usage:

flat_file="/tmp/big_file.txt"
fifo_path="${flat_file}.fifo"
load_file="${flat_file}.load"
chunk_size=100000

# Split the file
pt-fifo-split --force --lines ${chunk_size} ${flat_file} --fifo ${fifo_path} &

# Sleep 10 seconds to ensure ${fifo_path} exists before entering the loop
sleep 10

while [ -e ${fifo_path} ]
do
  # Write chunk to disk
  cat ${fifo_path} > ${load_file}

  # Load chunk into table
  mysql --database=test --show-warnings -vve "load data infile '${load_file}' into table my_table;"
done

For a performance evaluation of pt-fifo-split, refer to this article.


