Importing large amounts of data into MySQL
1. For MyISAM tables, you can speed up bulk imports by disabling index maintenance during the load:
ALTER TABLE tbl_name DISABLE KEYS;
-- load the data
ALTER TABLE tbl_name ENABLE KEYS;
These two statements disable and re-enable updates to the non-unique indexes of a MyISAM table. When importing a large amount of data into a non-empty MyISAM table, wrapping the load in these statements improves import speed, because the indexes are rebuilt in one pass at the end instead of being updated row by row. When importing into an empty MyISAM table, the indexes are by default created only after the data has been loaded, so no extra setting is needed.
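A minimal sketch of the pattern above, assuming a hypothetical table `logs` loaded from a CSV file (the table name, file path, and field delimiters are illustrative):

```sql
-- Skip non-unique index maintenance during the bulk load
ALTER TABLE logs DISABLE KEYS;

-- Bulk-load the file; LOAD DATA INFILE is typically much faster
-- than individual INSERT statements
LOAD DATA INFILE '/tmp/logs.csv'
INTO TABLE logs
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';

-- Rebuild the non-unique indexes in a single pass
ALTER TABLE logs ENABLE KEYS;
```

Note that DISABLE KEYS affects only non-unique indexes; unique indexes are still checked during the load.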
2. For InnoDB tables, the method above does not help. Instead, InnoDB imports can be sped up in the following ways:
<1> Because InnoDB stores rows in primary key order, importing data that is already sorted by primary key is noticeably faster. If an InnoDB table has no primary key, InnoDB creates a hidden internal column to serve as one, and you lose this advantage. So where possible, define an explicit primary key and sort the input by it before loading.
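A sketch of this idea, assuming a hypothetical `orders` table and input file (all names are illustrative): declare an explicit primary key, then load a file that was exported pre-sorted by that key.

```sql
CREATE TABLE orders (
    id     INT NOT NULL,
    amount DECIMAL(10,2),
    PRIMARY KEY (id)   -- explicit PK avoids InnoDB's hidden internal key
) ENGINE=InnoDB;

-- The input file is assumed to have been exported already sorted
-- by the primary key, e.g. with:
--   SELECT id, amount FROM source_orders ORDER BY id
--   INTO OUTFILE '/tmp/orders.csv' FIELDS TERMINATED BY ',';
LOAD DATA INFILE '/tmp/orders.csv'
INTO TABLE orders
FIELDS TERMINATED BY ',';
```

Loading in primary key order means InnoDB appends to the clustered index instead of repeatedly splitting pages in the middle.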
<2> Run SET UNIQUE_CHECKS = 0 before importing to disable uniqueness checks, and run SET UNIQUE_CHECKS = 1 after the import to restore them. Skipping these checks during the load improves import speed, but only do this when the data is already known to be unique.
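Wrapped around a load, this looks like the following (the table and file names are illustrative, and the input is assumed to contain no duplicate keys):

```sql
SET UNIQUE_CHECKS = 0;   -- skip uniqueness checks on secondary unique indexes

LOAD DATA INFILE '/tmp/orders.csv'
INTO TABLE orders
FIELDS TERMINATED BY ',';

SET UNIQUE_CHECKS = 1;   -- restore checking for normal operation
```

Be aware that with checks disabled, duplicate rows are not rejected at load time, so validate the source data first.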
<3> If the connection uses autocommit, we recommend executing SET AUTOCOMMIT = 0 before the import to disable autocommit, and SET AUTOCOMMIT = 1 after the import to re-enable it. Committing the whole load as one transaction instead of once per statement also improves import speed.
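A minimal sketch of that pattern (table and file names are illustrative):

```sql
SET AUTOCOMMIT = 0;      -- stop committing after every statement

LOAD DATA INFILE '/tmp/orders.csv'
INTO TABLE orders
FIELDS TERMINATED BY ',';

COMMIT;                  -- one commit covering the entire load
SET AUTOCOMMIT = 1;      -- restore the usual autocommit behavior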