Importing Large Data Volumes into MySQL
First: in practice, the best way is to use the command line directly:

mysql -u username -ppassword database_name < database_name.sql

(Note that the import is done with the mysql client, not mysqldump; mysqldump only exports data.)
On Linux we tested an import of over 10,000 rows, 121 MB in total, and it completed within a few seconds. Importing a file of that size through phpMyAdmin or various GUI terminals is difficult: you may run into problems such as the browser freezing, temporary files growing too large, or the SQL size limits configured in PHP. I recommend the method above for imports.
If you have a dump of a single table, the command is the same:

mysql -uroot -ppassword database_name < biao.sql

(You do not pass a table name to the mysql client; the dump file itself determines which table is restored.)
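To make the direction of each tool clear, here is a minimal sketch of the round trip; the database name mydb and file name backup.sql are placeholders, not from the original text:

```shell
# Export: mysqldump writes the database out to a dump file.
mysqldump -u root -p mydb > backup.sql

# Import: the mysql client reads the dump file back in.
mysql -u root -p mydb < backup.sql
```

The `-p` flag with no value attached prompts for the password interactively, which avoids leaving it in the shell history.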
Second: use phpMyAdmin.
Modify the PHP environment configuration to raise the upload limits.
1) First, find the php.ini configuration file and press Ctrl+F to search for post_max_size. With a stock PHP installation the default value is 8M, and many integrated environments leave this parameter unchanged. If the file to be imported is smaller than 8 MB you do not need to modify it; if it is larger, raise the value as needed.
2) After configuring post_max_size, search for upload_max_filesize. With a stock installation the default is 2M; in an integrated environment it may be 8M. Change it to the size you want to import. (The maximum importable file size is the smaller of post_max_size and upload_max_filesize.)
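As a sketch, the two php.ini settings described above might look like this after being raised; the value 128M is an example, not a recommendation from the original text. Choose a value larger than your dump file:

```ini
; php.ini - raise both limits so a large .sql file can be uploaded.
; The effective cap is the smaller of the two values,
; so keep post_max_size >= upload_max_filesize.
post_max_size = 128M
upload_max_filesize = 128M
```

Restart the web server (or PHP-FPM) after editing php.ini so the new values take effect.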
This works, but it is far less reliable than the first method; the command-line approach is recommended.