When the amount of data surges, a common optimization is to shard the data across multiple tables (hashing rows into table partitions) to keep reads and writes fast. For example:
100 million rows of data, split into 100 tables.
1. First, create the 100 tables
$i = 0;
while ($i <= 99) {
    echo "code_$i\r\n";          // print the name of the table being created
    $sql = "CREATE TABLE `code_$i` (
        `full_code` char(10) NOT NULL,
        `create_time` int(10) unsigned NOT NULL,
        PRIMARY KEY (`full_code`)
    ) ENGINE=MyISAM DEFAULT CHARSET=utf8";
    mysql_query($sql);
    $i++;
}
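Note that the mysql_* functions used above were removed in PHP 7. A minimal sketch of the same loop using PDO (the host, database name test, and credentials here are placeholders, not part of the original example):

$pdo = new PDO('mysql:host=localhost;dbname=test;charset=utf8', 'db_user', 'db_pass');
for ($i = 0; $i <= 99; $i++) {
    // Same shard definition as above, one table per bucket (placeholder connection settings)
    $pdo->exec("CREATE TABLE IF NOT EXISTS `code_$i` (
        `full_code` char(10) NOT NULL,
        `create_time` int(10) unsigned NOT NULL,
        PRIMARY KEY (`full_code`)
    ) ENGINE=MyISAM DEFAULT CHARSET=utf8");
}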
2. Sharding rule:
full_code is the primary key; hash full_code to decide which table a row belongs to.
$table_name = get_hash_table('code', $full_code);

function get_hash_table($table, $code, $s = 100) {
    $hash = sprintf("%u", crc32($code));   // unsigned CRC32 of the code
    echo $hash;                            // debug output of the hash value
    $hash1 = intval(fmod($hash, $s));      // bucket number 0..99
    return $table . "_" . $hash1;          // e.g. "code_37"
}
Before inserting a row, call get_hash_table() to get the name of the table where the data should be stored.
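For illustration, an insert might look like the sketch below (the sample $full_code value and the already-open mysql connection are assumptions, not part of the original):

$full_code = 'AB12345678';                          // sample 10-character code
$table_name = get_hash_table('code', $full_code);   // e.g. "code_37"
$sql = "INSERT INTO `$table_name` (`full_code`, `create_time`)
        VALUES ('" . mysql_real_escape_string($full_code) . "', " . time() . ")";
mysql_query($sql);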
3. Use the MERGE storage engine to provide a complete view of the code table
CREATE TABLE IF NOT EXISTS `code` (
    `full_code` char(10) NOT NULL,
    `create_time` int(10) unsigned NOT NULL,
    INDEX (`full_code`)
) ENGINE=MERGE UNION=(code_0, code_1, code_2, ...) INSERT_METHOD=LAST;
All full_code data can then be queried with SELECT * FROM code.
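For point lookups you can also skip the MERGE table and query only the shard computed by get_hash_table(); a sketch (the sample code value is an assumption):

// Point lookup: query just the shard that holds this code
$full_code  = 'AB12345678';
$table_name = get_hash_table('code', $full_code);
$row = mysql_fetch_assoc(mysql_query(
    "SELECT * FROM `$table_name` WHERE `full_code` = '" . mysql_real_escape_string($full_code) . "'"
));

// Full scan or aggregate: go through the MERGE table that unions all 100 shards
$total = mysql_fetch_row(mysql_query("SELECT COUNT(*) FROM `code`"));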