PHP Performance Optimization Tips for Importing Large Amounts of Data into MySQL

Source: Internet
Author: User

This article describes how to optimize the performance of importing large amounts of data into MySQL using PHP. It is shared here for your reference; the detailed analysis follows.

We often use PHP to import files into MySQL. Here we analyze and optimize the import of 15,000 records.

As mentioned in several earlier posts, tiandi recently helped a friend build a small project for tracking phone numbers: each run randomly draws a batch of numbers from the database as needed, and people then keep calling those numbers to promote a product (behavior I slightly despise, but when a friend asks for help, it is hard to refuse). The program was finished and handed over for testing two weeks ago. A few days ago the friend called to say that importing phone numbers was taking longer and longer every day, sometimes more than half an hour for 10,000 records, and asked whether the speed could be improved.

I took a look at the code. The database structure is very simple: just two fields, one for the phone number and one for its category (c, d, e, and so on, indicating states such as "called" and "not yet called"). The overall program logic is as follows:

■ Get a txt file containing the phone number

■ Import the txt file to mysql through a program

■ During the import, check whether each phone number in the txt file already exists in MySQL. If it does not, insert a new record directly; if it does, update the record's category.
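As an aside, the insert-or-update branch in the last step can also be pushed into MySQL itself with `INSERT ... ON DUPLICATE KEY UPDATE`, which avoids a separate lookup query per number. This is a sketch, not the article's original method; it assumes the `tel` column has a UNIQUE index, and the table/column names (`tel`, `type`) are taken from the article's code:

```php
<?php
// Sketch: build one upsert statement for a batch of phone numbers.
// Assumes `tel` has a UNIQUE index so duplicates trigger the UPDATE branch.
// NOTE: values are concatenated for brevity, matching the article's style;
// real code should escape or parameterize them.
function buildUpsertSql($usertable, array $tels, $type)
{
    $values = array();
    foreach ($tels as $tel) {
        $values[] = "('" . $tel . "', '" . $type . "')";
    }
    return "INSERT INTO " . $usertable . " (tel, type) VALUES "
         . implode(",", $values)
         . " ON DUPLICATE KEY UPDATE type = VALUES(type)";
}

// Usage: one statement inserts new numbers and re-categorizes existing ones.
$sql = buildUpsertSql("phonebook", array("13800000001", "13800000002"), "c");
```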

Since every telephone number in the txt file must be compared against the database, the program is bound to take some time. We will set that aside here, because the point of this article is to optimize the write speed. When does the program write records? According to the logic above, a record is written whenever no match is found in the database (the update operation is also a write, but only inserts are discussed here). The logic above translates into code as follows:

The code is as follows:

// $array is exploded from the txt file, each element a phone number; $str is the category
for ($i = 0; $i < count($array); $i++) {
    $tmpstr = "'" . $array[$i] . "', '" . $str . "'";
    $sql = "INSERT INTO " . $usertable . " (tel, type) VALUES (" . $tmpstr . ")";
    mysql_query($sql);
}


The above code is completely correct, but inefficient. When the txt file contains tens of thousands of phone numbers, it performs tens of thousands of database insert operations. Each individual write is fast, but tens of thousands of them accumulate execution time that cannot be ignored. A simple test of inserting 15,000 records this way took about 5 minutes; adding the preceding comparison logic, half an hour is no surprise. This is not acceptable, so the number of database writes must be reduced, and the code above is changed to the following:

The code is as follows:

$sql2 = "INSERT INTO " . $usertable . " (tel, type) VALUES ";
for ($i = 0; $i < count($array); $i++) {
    $tmpstr = "'" . $array[$i] . "', '" . $str . "'";
    $sql2 .= "(" . $tmpstr . "),";
}
$sql2 = substr($sql2, 0, -1); // remove the trailing comma
mysql_query($sql2);


In this way, the entire import requires only a single write operation, greatly shortening the execution time: the same 15,000 records are handled in about 10 seconds. That concludes this article. If you find that writing a large amount of data to MySQL takes a long time, try the optimization described here.
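One caveat worth adding: a single multi-row INSERT for a very large file can exceed MySQL's `max_allowed_packet` limit. A common compromise, sketched below under assumed names (`buildChunkedInserts` is not from the article), is to split the rows into fixed-size chunks and issue one INSERT per chunk, keeping most of the speedup while bounding statement size:

```php
<?php
// Sketch: split the phone numbers into chunks and build one multi-row
// INSERT per chunk, so no single statement grows past max_allowed_packet.
// As in the article's code, values are concatenated for brevity; real code
// should escape or parameterize them.
function buildChunkedInserts($usertable, array $tels, $type, $chunkSize = 1000)
{
    $sqls = array();
    foreach (array_chunk($tels, $chunkSize) as $chunk) {
        $values = array();
        foreach ($chunk as $tel) {
            $values[] = "('" . $tel . "', '" . $type . "')";
        }
        $sqls[] = "INSERT INTO " . $usertable
                . " (tel, type) VALUES " . implode(",", $values);
    }
    return $sqls;
}

// Usage: run each chunk with mysql_query() (or, in modern PHP, mysqli/PDO).
// 15,000 numbers with $chunkSize = 1000 yields 15 statements instead of 15,000.
$statements = buildChunkedInserts("phonebook", $array, $str, 1000);
```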

I hope this article helps you with your PHP programming.

