I had just installed Ubuntu and MySQL, and when I tried to import a text file into a table with the LOAD DATA INFILE command, I got the following error:
ERROR (HY000): File '/home/bioinformation/downloads/data.txt' not found (Errcode: 13)
So I checked the LOAD DATA syntax:
LOAD DATA [LOW_PRIORITY | CONCURRENT] [LOCAL] INFILE 'file_name.txt'
    [REPLACE | IGNORE]
    INTO TABLE tbl_name
    [FIELDS
        [TERMINATED BY 'string']
        [[OPTIONALLY] ENCLOSED BY 'char']
        [ESCAPED BY 'char']
    ]
    [LINES
        [STARTING BY 'string']
        [TERMINATED BY 'string']
    ]
    [IGNORE number LINES]
    [(col_name_or_user_var, ...)]
    [SET col_name = expr, ...]
The LOAD DATA INFILE statement reads rows from a text file into a table at very high speed. The file name must be given as a literal string.
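To make the syntax concrete, here is a minimal sketch (the table, file, and column names are illustrative, not from the original post):

```sql
-- Load a tab-separated file into a hypothetical table `people`,
-- skipping one header line and filling only two named columns.
LOAD DATA INFILE '/tmp/people.txt'
INTO TABLE people
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(name, age);
```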
The syntax looked fine, so I wondered whether I should change the MySQL user's privileges or run as "root". The explanations I found online were either hard to follow or too complicated. But the error simply means the file cannot be found, so the idea was to copy the file into the directory where MySQL actually looks for it. Since I hadn't paid attention during installation and didn't know where MySQL was installed, rather than hunt for it myself, I let MySQL tell me by loading with a relative path:
LOAD DATA INFILE 'data.txt' INTO TABLE table_name;
Sure enough, MySQL reported: Can't get stat of '/var/lib/mysql/test/data.txt' (Errcode: 2). So I tried to copy '/home/bioinformation/downloads/data.txt' to '/var/lib/mysql/test/data.txt', only to find my user lacked permission for that directory. I used "sudo su" to switch to the root user, copied the file, and then ran:
LOAD DATA INFILE 'data.txt' INTO TABLE table_name;
Sure enough, it showed "Query OK, ...", and then I switched back to my normal user. To summarize, it is very simple: as root, copy the data file into the target database's folder under MySQL's data directory (here /var/lib/mysql/test/), then load it with a relative path: LOAD DATA INFILE 'data.txt' INTO TABLE table_name.
How to improve the speed of MySQL LOAD DATA INFILE
The test data is 2.5 GB, 9,427,567 rows in total, loaded on a MySQL server with a large-memory configuration.
A plain load takes about 10 minutes.
The table uses the MyISAM engine; tuning a few session variables helped:
SET SESSION bulk_insert_buffer_size = 256217728;
SET SESSION myisam_sort_buffer_size = 256217728;
The result of the run:
Query OK, 9427567 rows affected, 1558 warnings (3 min 55.21 sec)
Records: 9427567  Deleted: 0  Skipped: 0  Warnings: 0
A Google search also turns up these settings:
SET GLOBAL key_buffer_size = 256217728;
ALTER TABLE tablename DISABLE KEYS;
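For a MyISAM table, DISABLE KEYS skips non-unique index maintenance during the load, and the indexes must be rebuilt afterwards with ENABLE KEYS. A sketch of the full sequence (the table name is illustrative, and the buffer value is the one used above):

```sql
SET GLOBAL key_buffer_size = 256217728;   -- larger index buffer for the rebuild
ALTER TABLE tablename DISABLE KEYS;        -- skip non-unique index updates during the load
LOAD DATA INFILE 'data.txt' INTO TABLE tablename;
ALTER TABLE tablename ENABLE KEYS;         -- rebuild the indexes in one pass
```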
How to load data in a different encoding
With the old MySQL 4.x, after a SELECT ... INTO OUTFILE I could only convert the encoding afterwards with iconv or similar software. But because this dataset is large, even opening it in an editor such as UltraEdit takes forever. Fortunately, newer versions of MySQL let you add an extra clause:
CHARACTER SET GBK
My text data is GBK-encoded while the table is UTF-8, and loading this way worked in my test.
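In the statement, the CHARACTER SET clause goes right after the table name and before any FIELDS or LINES clauses; a minimal sketch with illustrative file and table names:

```sql
-- Load a GBK-encoded file into a UTF-8 table;
-- MySQL converts the encoding during the load.
LOAD DATA INFILE '/tmp/data_gbk.txt'
INTO TABLE my_table
CHARACTER SET GBK
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n';
```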
How to insert only specific columns when loading
For example, the table may have more columns than the data file. You can specify which fields to fill at load time:
LOAD DATA INFILE '~/data.txt' INTO TABLE fb0505
CHARACTER SET GBK
FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY ''
LINES TERMINATED BY '\n' STARTING BY ''
(Seller_id, fb_type, fb_date, item_url);
The table fb0505 also has an id column that is not present in the data file.