Server Environment
PHP Environment
What happens: when uploading a ~30 MB file, the local client submits the data and the server can be seen receiving traffic.
But the upload never finishes: $_FILES is never populated, and execution never reaches the line of code that reads it. For example, I submit a form with a 30 MB file, yet the observed traffic climbs past several hundred MB and never stops; client and server just keep exchanging data indefinitely. Has anyone run into this situation? Is there some PHP configuration I have missed?
Large file upload problem resolved; below I use my test upload of a 32 MB file as the example.
PHP execution mode: fpm-fcgi
php.ini
upload_max_filesize = 64M   ; maximum size of a single uploaded file
post_max_size = 100M        ; maximum size of the whole POST body; must be larger than upload_max_filesize
max_execution_time = 300    ; script execution timeout, in seconds
max_input_time = 300        ; maximum time allowed for parsing request input, in seconds
memory_limit = 200M         ; if set too low, the script can exhaust memory and fail with a 500 error
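One detail worth knowing: when the POST body exceeds post_max_size, PHP discards the entire request, so $_FILES stays empty and the handler code never sees the upload, which matches the symptom described above. A minimal sketch of checking the upload result explicitly (the form field name "file" and the function name are assumptions for illustration):

```php
<?php
// Minimal upload-result check; returns a short status string.
// The field name "file" is an assumption about the HTML form.
function check_upload(array $files): string
{
    if (!isset($files['file'])) {
        // When the request body exceeds post_max_size, PHP discards it
        // entirely: $_FILES stays empty and no error code is available.
        return 'post_max_size exceeded (empty $_FILES)';
    }
    switch ($files['file']['error']) {
        case UPLOAD_ERR_OK:
            return 'ok';
        case UPLOAD_ERR_INI_SIZE:   // file exceeds upload_max_filesize in php.ini
        case UPLOAD_ERR_FORM_SIZE:  // file exceeds MAX_FILE_SIZE from the form
            return 'file too large';
        default:
            return 'upload failed: code ' . $files['file']['error'];
    }
}

// In a real handler: echo check_upload($_FILES);
```

This makes size-limit failures visible instead of the script silently never running.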
nginx.conf
client_max_body_size 100m;   # adjust to match the size of the files you expect to upload
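For context, client_max_body_size can be set at the http, server, or location level; without it, nginx rejects large uploads with 413 Request Entity Too Large before PHP ever runs. A minimal server-block sketch (the port is a placeholder):

```nginx
server {
    listen 80;
    # Allow request bodies up to 100 MB; keep this at least as large
    # as post_max_size in php.ini so nginx is not the limiting factor.
    client_max_body_size 100m;
}
```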
That is the whole fix.
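Since the limits above interact (post_max_size must exceed upload_max_filesize), it can help to compare them programmatically. A small sketch that converts php.ini shorthand sizes to bytes; the function name is just for illustration:

```php
<?php
// Convert a php.ini shorthand size ("64M", "100M", "1G") to bytes
// so the upload limits can be compared in code.
function shorthand_to_bytes(string $value): int
{
    $value = trim($value);
    $unit  = strtoupper(substr($value, -1));
    $num   = (int) $value;
    switch ($unit) {
        case 'G': return $num * 1024 ** 3;
        case 'M': return $num * 1024 ** 2;
        case 'K': return $num * 1024;
        default:  return $num; // plain byte count, no suffix
    }
}

// Sanity check at runtime: post_max_size should be the larger limit.
// if (shorthand_to_bytes(ini_get('post_max_size'))
//         <= shorthand_to_bytes(ini_get('upload_max_filesize'))) {
//     error_log('post_max_size must be larger than upload_max_filesize');
// }
```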
I pieced this together from Baidu searches and am sharing it here. If anyone has a more thorough system-configuration or optimization example for large file uploads, please point me in the right direction. Thanks!
Copyright notice: this is an original blog article and may not be reproduced without the author's consent.