Welcome to the Linux community forum and interact with 2 million technicians.
To run simulated tests of the current production network, we needed to build a test environment and import hundreds of millions of records into the database. This is a brief record of the problems encountered.
Data: one database, 2,000 tables, 30 columns per record, 86,388,670 records per table on average.
Environment: 64-bit Linux, 8 GB memory, 4-core Intel X3320 @ 2.5 GHz
Basic Ideas:
mysql -utest -pxxxx -s -e "source xxxx.sql"
Place the load statements in xxxx.sql.
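A minimal sketch of preparing such a script, assuming one dump file per table and that each file is named after its table (the user/password `test`/`xxxx` and all paths are placeholders, as in the post):

```shell
#!/bin/sh
# Sketch: generate one LOAD DATA statement per dump file, collect them
# into a .sql script, then feed that script to the mysql client.
# User/password (test/xxxx) and paths are placeholders, not real values.

gen_load_sql() {
    # $1 = directory containing .txt dumps, $2 = output .sql script
    : > "$2"
    for f in "$1"/*.txt; do
        [ -e "$f" ] || continue            # no dumps present: leave script empty
        tbl=$(basename "$f" .txt)          # assume file name matches table name
        printf "LOAD DATA INFILE '%s' IGNORE INTO TABLE %s;\n" "$f" "$tbl" >> "$2"
    done
}

# Demo on a throwaway directory:
dir=$(mktemp -d)
touch "$dir/orders.txt" "$dir/users.txt"
gen_load_sql "$dir" "$dir/xxxx.sql"
cat "$dir/xxxx.sql"
# The script is then executed as in the post:
#   mysql -utest -pxxxx -s -e "source xxxx.sql"
```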
Load syntax:
LOAD DATA [LOW_PRIORITY] [LOCAL] INFILE 'file_name.txt' [REPLACE | IGNORE] INTO TABLE tbl_name
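Filled in with illustrative values, a statement of this shape as it might appear in xxxx.sql (the path, table name, and delimiters are assumptions, not from the post):

```shell
# Illustrative only: path, table name, and delimiters are assumed values.
sql=$(cat <<'EOF'
LOAD DATA LOW_PRIORITY INFILE '/data/import/t0001.txt'
IGNORE INTO TABLE t0001
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n';
EOF
)
printf '%s\n' "$sql"
```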
Importing the data serially is slow, so consider loading in parallel; however, parallel loads into the same table may conflict. Solutions:
Clear the table before the load:
TRUNCATE TABLE xxxx; or DELETE FROM xxxx;
Or use the REPLACE or IGNORE keyword in the LOAD DATA statement.
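The parallel idea above can be sketched as follows. `mysql_stub` only prints the command it would run, so the sketch runs without a server; in real use `MYSQL_CMD` would be something like `mysql -utest -pxxxx -s`, and the per-table script names here are hypothetical:

```shell
#!/bin/sh
# Sketch: launch one mysql client per .sql script concurrently instead of
# one serial "source". mysql_stub just echoes the command for illustration.
mysql_stub() { echo "would run: mysql $*"; }
MYSQL_CMD=${MYSQL_CMD:-mysql_stub}

run_parallel() {
    for script in "$@"; do
        $MYSQL_CMD -e "source $script" &    # one client per script
    done
    wait                                    # block until all imports finish
}

# Hypothetical per-table scripts, each holding LOAD DATA ... IGNORE statements:
run_parallel part1.sql part2.sql part3.sql
```

Each script should touch a disjoint set of tables (or use REPLACE/IGNORE as above) so the concurrent loads do not conflict.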