PHP reads the database to generate XML files: solution

Source: Internet
Author: User
Tags: php reader
There are more than 600,000 records in the database, and PHP + MySQL is used to read through them in a loop to generate XML files.
We put 1,000 records into each XML page, which means generating more than 60 XML pages.

Now the problem is: every time, after 13 pages of XML have been generated, that is, after about 13,000 entries have been read from the database, the page reports an error:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 80 bytes)

Could someone help me figure out what is going on? What should I do so that all the pages are generated smoothly?
Wishing everyone well!


------ Solution --------------------
The allocated memory is not enough. Edit php.ini, find memory_limit = 128M, and increase the value.
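For reference, the change in php.ini would look like the line below; 256M is only an example value, pick whatever your server can afford:

; php.ini -- per-script memory ceiling (256M is an example, not a recommendation)
memory_limit = 256M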
------ Solution --------------------
The memory limit set by memory_limit in the php.ini configuration file has been exceeded (the default is 128 MB, i.e. 134217728 bytes). 128 MB is plenty for this amount of data, so the problem is most likely in your code: are you accumulating every batch of 1,000 records into the same ever-growing variable inside the loop? Instead, read 1,000 records into a PHP variable, write the XML file, then read the next 1,000 into the same variable (overwriting it), write the next XML file, and so on.
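A minimal sketch of that batch-by-batch approach, using the old mysql_* API mentioned elsewhere in this thread; the connection details, table name, column names, and file naming are made up for illustration:

<?php
// Hypothetical connection and schema; adjust to your own.
$link = mysql_connect('localhost', 'user', 'pass');
mysql_select_db('mydb', $link);

$perPage = 1000;
$page    = 0;

while (true) {
    $offset = $page * $perPage;
    $result = mysql_query("SELECT id, title FROM items ORDER BY id LIMIT $offset, $perPage", $link);
    if (!$result || mysql_num_rows($result) == 0) {
        break;                      // no more records
    }

    // Reuse the same variable for every batch instead of appending forever.
    $xml = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<items>\n";
    while ($row = mysql_fetch_assoc($result)) {
        $xml .= '  <item id="' . $row['id'] . '">'
              . htmlspecialchars($row['title']) . "</item>\n";
    }
    $xml .= "</items>\n";

    file_put_contents('page_' . ($page + 1) . '.xml', $xml);

    mysql_free_result($result);     // release this batch's result set
    unset($xml);                    // drop the batch buffer before the next loop
    $page++;
}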
------ Solution --------------------
If the memory limit is exceeded, you can try increasing the limit in php.ini.
But it is better to read in more, smaller batches, and to call unset() and mysql_free_result() at the end of each loop iteration to release memory that is no longer needed.
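For example, the cleanup at the end of each iteration could look like this (a sketch; $result and $xml stand for whatever result resource and batch buffer your loop actually uses):

// ...inside the loop, after the current batch has been written to its XML file:
mysql_free_result($result);   // free the MySQL result set for this batch
unset($xml);                  // drop the PHP buffer holding the batch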
------ Solution --------------------
Post the key part of the code so we can take a look.
------ Solution --------------------
Discussion
Are you sure you can modify the server's configuration file?

------ Solution --------------------
Discussion

If the memory limit is exceeded, you can try increasing the limit in php.ini.
But it is better to read in more, smaller batches, and to call unset() and mysql_free_result() at the end of each loop iteration to release memory that is no longer needed.

------ Solution --------------------
The memory is insufficient.
The remedies are:
1. Increase the memory limit: ini_set('memory_limit', '???M');
2. Reduce the memory the program uses: check your code.
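For instance, to raise the limit at runtime from inside the script (256M is only an example value):

ini_set('memory_limit', '256M');   // example value; set it to whatever your server can spare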
------ Solution --------------------
If I'm not mistaken, that should be more than 600 pages, not 60.
mysql_free_result() releases the result set, not the MySQL connection. Free the result set after each batch (even once every 10 pages would help). If you don't believe those numbers, there is nothing more to be done.
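To make the distinction concrete (a sketch; $result and $link are whatever your script already holds):

mysql_free_result($result);  // releases the memory held by one result set
mysql_close($link);          // closes the connection itself -- a different thing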
------ Solution --------------------
As long as the data itself is fine, reading and writing in batches should work. If the server has restrictions and you cannot change its parameters, you can download the data and process it locally.
Discussion

Reference:
If I'm not mistaken, that should be more than 600 pages, not 60.
mysql_free_result() releases the result set, not the MySQL connection. Free the result set after each batch (even once every 10 pages would help). If you don't believe those numbers, there is nothing more to be done.
Brother, you're talking about more than 600 pages, but can this actually be solved?

------ Solution --------------------
Discussion
Now the problem is: every time, after 13 pages of XML have been generated, that is, after about 13,000 entries have been read from the database, the page reports an error:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to ......

------ Solution --------------------
Paste the code and the problem will be obvious.
------ Solution --------------------
Another possibility: the average length of the records returned by your query is relatively large (for example, the average record length runs into the kilobytes). You can try replacing mysql_query with mysql_unbuffered_query.
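mysql_unbuffered_query() lets you fetch rows from the server one at a time instead of copying the entire result set into PHP memory first, which keeps memory usage flat. A sketch of the swap; the table, columns, and $link connection are placeholders:

// Buffered: the whole result set is loaded into PHP memory up front.
// $result = mysql_query("SELECT id, title FROM items", $link);

// Unbuffered: rows are streamed from the server as you fetch them.
$result = mysql_unbuffered_query("SELECT id, title FROM items", $link);
while ($row = mysql_fetch_assoc($result)) {
    // build/write the XML for this row, then move on
}
mysql_free_result($result);  // note: all rows must be fetched before issuing another query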
------ Solution --------------------
Discussion
504 Gateway Time-out

--------------------------------------------

Nginx/0.6.35

------ Solution --------------------
504 Gateway Time-out is a timeout.
It is Nginx that times out, not PHP, because your script never echoes anything while it runs.

DOMDocument consumes a lot of memory
and is not suitable for your application.
Paste the XML structure here; it may even be possible to assemble it in SQL.
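If the XML is simple enough, each row's fragment can be concatenated on the MySQL side and PHP only glues the pieces together and writes them to disk. A rough sketch with made-up column names, assuming an existing $link connection; note the CONCAT output is not escaped:

<?php
// Let MySQL assemble each <item> fragment; PHP just wraps and streams it.
$sql = "SELECT CONCAT('<item id=\"', id, '\">', title, '</item>') AS fragment
        FROM items ORDER BY id LIMIT 0, 1000";
$result = mysql_unbuffered_query($sql, $link);

$fh = fopen('page_1.xml', 'w');
fwrite($fh, "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<items>\n");
while ($row = mysql_fetch_assoc($result)) {
    fwrite($fh, $row['fragment'] . "\n");   // stream each fragment straight to disk
    echo '.'; flush();                      // trickle of output so the proxy is less likely to time out
}                                           // (depending on buffering, ob_flush() may also be needed)
fwrite($fh, "</items>\n");
fclose($fh);
mysql_free_result($result);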

------ Solution --------------------
As for the 504 Gateway Time-out, take a look at these: http://www.google.com.hk/search?client=aff-cs-360chromium&ie=UTF-8&q=504+Gateway+Time-out
There is no need to argue with you.

Since you have no permissions on the server, running it from the command line is beside the point anyway.
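If you are able to run the script from a command line at all, the memory limit can usually be overridden per invocation without touching php.ini; the script name below is hypothetical:

php -d memory_limit=512M generate_xml.php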
