A strange problem, hoping someone who has seen it can answer: submitting a large file through a form keeps sending data to the server without end
PHP environment
Case: when uploading a 30 MB file, the client keeps submitting data and the server shows continuous inbound I/O traffic, but it never finishes. The PHP line that outputs $_FILES never executes. For example, I submit a form with a 30 MB file, yet the traffic looks like several hundred MB and it still does not stop; the data exchange just keeps going. Has anyone run into this? Which PHP settings am I missing?
Reply to discussion (solution)
Watch for changes to the temporary upload files on the server.
Receiving the uploaded file into a temporary file is done by PHP before your script code runs, so it cannot be observed at the code level.
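To see the upload arriving before the script runs, you can check where PHP stores its temporary upload files and watch that directory while the POST is in progress. A minimal sketch, assuming a default setup (the directory depends on your server configuration):

<?php
// Show where PHP writes temporary upload files.
// If upload_tmp_dir is not set, PHP falls back to the system temp directory.
$tmpDir = ini_get('upload_tmp_dir');
if ($tmpDir === '' || $tmpDir === false) {
    $tmpDir = sys_get_temp_dir();
}
echo "Upload temp dir: {$tmpDir}\n";
// While a file is being received, partial files named like "php*" appear here
// and grow; they are removed if the upload is aborted or rejected.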
The large file upload problem has been solved. This example uses a 32 MB test file upload.
PHP run mode: FPM-FCGI
php.ini
upload_max_filesize = 64M    ; maximum size of a single uploaded file
post_max_size = 100M         ; maximum size of the whole POST body; must be larger than upload_max_filesize
max_execution_time = 300     ; script execution timeout, in seconds
max_input_time = 300         ; input parsing timeout, in seconds
memory_limit = 200M          ; if too small, memory exhaustion causes a 500 error
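To confirm the FPM pool is actually using these values (the CLI and FPM can load different ini files), a quick check from a web-served script, for example:

<?php
// Print the effective limits as seen by the PHP process handling the request.
foreach (['upload_max_filesize', 'post_max_size', 'max_execution_time',
          'max_input_time', 'memory_limit'] as $directive) {
    echo $directive . ' = ' . ini_get($directive) . "\n";
}
// Remember to reload PHP-FPM after editing php.ini so the changes take effect.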
nginx.conf
client_max_body_size 100m;    # adjust to accommodate the largest expected upload
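Once both limits are raised, the receiving script can tell a successful upload apart from one rejected by the size limits. A rough sketch, where the form field name "file" and the target path are placeholders, not part of the original post:

<?php
// Assumes the upload form field is named "file"; adjust to your form.
if (empty($_FILES)) {
    // If the request body exceeds post_max_size, PHP discards it and
    // both $_POST and $_FILES arrive empty.
    exit('No upload data received (request body may exceed post_max_size).');
}

$err = $_FILES['file']['error'];
if ($err === UPLOAD_ERR_OK) {
    // Placeholder target path; move the file out of the temp directory.
    move_uploaded_file($_FILES['file']['tmp_name'],
                       '/data/uploads/' . basename($_FILES['file']['name']));
    echo 'Upload finished: ' . $_FILES['file']['size'] . " bytes\n";
} elseif ($err === UPLOAD_ERR_INI_SIZE) {
    echo "File exceeds upload_max_filesize\n";
} elseif ($err === UPLOAD_ERR_PARTIAL) {
    echo "File was only partially uploaded\n";
} else {
    echo "Upload failed with error code {$err}\n";
}
// If the body exceeds nginx client_max_body_size, nginx answers 413
// (Request Entity Too Large) and this script never runs at all.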
That is the solution above. I had searched Baidu for what kind of system configuration tuning allows uploading large files, and this is what I ended up with; hopefully it is a useful reference. Thank you!!