A summary of methods for quickly inserting large amounts of data into MySQL


Description

Over the past few days I experimented with inserting large amounts of data into MySQL tables using different storage engines, mainly comparing MyISAM and InnoDB. The experimental process follows:

Procedure:
1. InnoDB storage engine
Create the database and table

The code is as follows:

> CREATE DATABASE ecommerce;
> CREATE TABLE employees (
  id INT NOT NULL,
  fname VARCHAR(30),
  lname VARCHAR(30),
  birth TIMESTAMP,
  hired DATE NOT NULL DEFAULT '1970-01-01',
  separated DATE NOT NULL DEFAULT '9999-12-31',
  job_code INT NOT NULL,
  store_id INT NOT NULL
)
PARTITION BY RANGE (store_id) (
  PARTITION p0 VALUES LESS THAN (10000),
  PARTITION p1 VALUES LESS THAN (50000),
  PARTITION p2 VALUES LESS THAN (100000),
  PARTITION p3 VALUES LESS THAN (150000),
  PARTITION p4 VALUES LESS THAN MAXVALUE
);

Create a stored procedure

The code is as follows:

> USE ecommerce;
> delimiter //
The delimiter command changes the statement delimiter from ; to //. Without it, the first ; inside the procedure body (for example, after DECLARE var INT;) would end the CREATE PROCEDURE statement prematurely and MySQL would report an error.
> CREATE PROCEDURE batchinsert (IN init INT, IN loop_time INT)
BEGIN
  DECLARE var INT;
  DECLARE id INT;
  SET var = 0;
  SET id = init;
  WHILE var < loop_time DO
    INSERT INTO employees (id, fname, lname, birth, hired, separated, job_code, store_id)
    VALUES (id, CONCAT('Chen', id), CONCAT('Haixiang', id), NOW(), NOW(), NOW(), 1, id);
    SET id = id + 1;
    SET var = var + 1;
  END WHILE;
END;
//
> delimiter ;

This changes the delimiter back to ;.
Call the stored procedure to insert the data

The code is as follows:

> CALL batchinsert(30036, 200000);

Time taken: 3 h 37 min 8 sec
2. MyISAM storage engine
Create a table

The code is as follows:

> USE ecommerce;
> CREATE TABLE ecommerce.customer (
  id INT NOT NULL,
  email VARCHAR(64) NOT NULL,    -- the lengths of email, name, and password were missing in the original; 64/32/32 are assumed
  name VARCHAR(32) NOT NULL,
  password VARCHAR(32) NOT NULL,
  phone VARCHAR(13),
  birth DATE,
  sex INT(1),
  avatar BLOB,
  address VARCHAR(64),
  regtime DATETIME,
  lastip VARCHAR(15),
  modifytime TIMESTAMP NOT NULL,
  PRIMARY KEY (id)
) ENGINE = MyISAM ROW_FORMAT = DEFAULT
PARTITION BY RANGE (id) (
  PARTITION p0 VALUES LESS THAN (100000),
  PARTITION p1 VALUES LESS THAN (500000),
  PARTITION p2 VALUES LESS THAN (1000000),
  PARTITION p3 VALUES LESS THAN (1500000),
  PARTITION p4 VALUES LESS THAN (2000000),
  PARTITION p5 VALUES LESS THAN MAXVALUE
);

Create a stored procedure

The code is as follows:

> USE ecommerce;
> DROP PROCEDURE IF EXISTS ecommerce.batchinsertcustomer;
> delimiter //
> CREATE PROCEDURE batchinsertcustomer (IN start INT, IN loop_time INT)
BEGIN
  DECLARE var INT;
  DECLARE id INT;
  SET var = 0;
  SET id = start;
  WHILE var < loop_time DO
    INSERT INTO customer (id, email, name, password, phone, birth, sex, avatar, address, regtime, lastip, modifytime)
    VALUES (id,
            CONCAT(id, '@sina.com'),
            CONCAT('Name_', RAND(id) * 10000 MOD 10000),
            '123456',
            '13800000000',
            ADDDATE('1995-01-01', (RAND(id) * 36520) MOD 3652),
            var % 2,
            'http:///it/u=2267714161,58787848&fm=52&gp=0.jpg',
            'Haidian District, Beijing',
            ADDDATE('1995-01-01', (RAND(id) * 36520) MOD 3652),
            '8.8.8.8',
            ADDDATE('1995-01-01', (RAND(id) * 36520) MOD 3652));
    SET var = var + 1;
    SET id = id + 1;
  END WHILE;
END;
//
> delimiter ;

Call the stored procedure to insert the data

The code is as follows:

> ALTER TABLE customer DISABLE KEYS;
> CALL batchinsertcustomer(1, 2000000);
> ALTER TABLE customer ENABLE KEYS;

Time taken: 8 min 50 sec
Comparing the two results above, the MyISAM storage engine is clearly the better choice for inserting large amounts of data. If you need to change a table's storage engine, you can use the command:
ALTER TABLE t ENGINE = MYISAM;

Another experiment

A while ago, a program I was writing had to insert a large number of rows into a MySQL database: 85,766,121 in total. With nearly 100 million rows, how do you insert them into MySQL quickly?


At the time, the rows were inserted one at a time; Navicat estimated it would take more than 10 hours to complete, so I gave up. In recent days I have been learning about MySQL, and the basic principles for improving insert efficiency are as follows:

» Bulk (multi-row) inserts are more efficient than single-row inserts
» Inserting into a table without indexes is faster than inserting into an indexed table
» Shorter SQL statements insert data faster than longer ones
Some of these factors may seem trivial, but when a large amount of data is inserted, even small efficiency factors add up to very different results. From the rules above, we can draw several practical conclusions about how to load data quickly.

» The LOAD DATA statement is more efficient than INSERT because it inserts data rows in bulk. The server only needs to parse and interpret a single statement, not many, and the indexes need to be refreshed only once after all rows have been processed, not once per row.
» If you can only use INSERT statements, use the form that supplies multiple rows of data in a single statement:

INSERT INTO table_name VALUES (...), (...), ...;

This reduces the total number of statements needed and minimizes the number of index refreshes.
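For instance, the single-row loop in the stored procedure above could be rewritten in this multi-row form. This is a sketch: the three rows and their values are illustrative, and in practice you would batch hundreds or thousands of rows per statement.

```sql
-- One statement carrying three rows of the employees table instead of
-- three separate INSERT statements (values are illustrative only).
INSERT INTO employees (id, fname, lname, birth, hired, separated, job_code, store_id)
VALUES
  (1, 'Chen1', 'Haixiang1', NOW(), NOW(), NOW(), 1, 1),
  (2, 'Chen2', 'Haixiang2', NOW(), NOW(), NOW(), 1, 2),
  (3, 'Chen3', 'Haixiang3', NOW(), NOW(), NOW(), 1, 3);
```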

Based on the conclusions above, I tested the same data and tables again today and found that LOAD DATA is far faster: it took only a little over 10 minutes! So when you need to insert a large amount of data into MySQL quickly, LOAD DATA is the way to go.

By the way, by default the LOAD DATA statement assumes that data-column values are separated by tabs (\t), rows are delimited by newline characters (\n), and the values appear in the same order as the table's columns. You can also tell it to read data files in other formats, or to read the column values in a different order; see the MySQL documentation for details.
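As a sketch of the default case described above (the file path /tmp/customer.tsv and its tab-separated layout are assumptions, not from the original experiment):

```sql
-- Assumes /tmp/customer.tsv is a tab-separated file whose columns appear
-- in the same order as the customer table; adjust the terminators for
-- other formats.
LOAD DATA INFILE '/tmp/customer.tsv'
INTO TABLE customer
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n';
```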


Summary

1. For MyISAM tables, large amounts of data can be imported quickly in the following way:

ALTER TABLE tblname DISABLE KEYS;

Loading the data

ALTER TABLE tblname ENABLE KEYS;

These two commands turn off and on the updating of non-unique indexes on a MyISAM table. When importing a large amount of data into a non-empty MyISAM table, wrapping the import in these two commands improves its efficiency. When importing into an empty MyISAM table, the data is imported before the indexes are built by default, so the commands are unnecessary.

2. For InnoDB tables, this approach does not improve import efficiency. For InnoDB tables, there are several other ways to speed up an import:

A. Because InnoDB tables are stored in primary-key order, arranging the imported data in primary-key order effectively improves import efficiency. If an InnoDB table has no primary key, the system creates an internal column as the primary key by default; so if you can give the table a primary key, you can exploit this ordering to improve import efficiency.

B. Execute SET unique_checks=0 before importing to turn off uniqueness checks, and SET unique_checks=1 after the import completes to restore them; this improves import efficiency.
C. If the application uses autocommit, it is recommended to execute SET autocommit=0 before importing to turn off autocommit, and to execute SET autocommit=1 after the import is finished to restore it.
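Putting points B and C together, an InnoDB import session might look like the following. This is a sketch of the session settings only, not a complete script; the actual bulk statements are elided.

```sql
SET unique_checks = 0;  -- B: skip uniqueness checks during the import
SET autocommit = 0;     -- C: collect the whole import into one transaction
-- ... run the bulk INSERT / LOAD DATA statements here ...
COMMIT;                 -- make the import durable with a single commit
SET autocommit = 1;     -- restore autocommit
SET unique_checks = 1;  -- restore uniqueness checks
```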
