I ran into a problem recently: Tomcat threw several failures, and when I checked the logs they were full of "too many open files" errors. Anyone familiar with this knows it means the process has opened more files than the user's open-file limit allows.
I then ran ls /proc/20861/fd/ | wc -l to count the files currently open by the Tomcat process, and sure enough it was already at 4095. The fix for this kind of problem is simply to raise the open-file limit. Simple, right?
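To make this check repeatable without hard-coding the PID (20861 was just the PID in my case), the same steps look roughly like this; the pgrep pattern is an assumption about how your Tomcat is started, so adjust it to your setup:

# find the Tomcat PID (the Bootstrap class is the usual Tomcat entry point)
PID=$(pgrep -f org.apache.catalina.startup.Bootstrap | head -n 1)
# count the file descriptors the process currently has open
ls /proc/"$PID"/fd | wc -l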
But if it were that easy to solve, I would not be writing this post. When I checked how many files the current user is allowed to open, the limit was 65535, far more than 4096. And /etc/security/limits.conf was also set to 65535.
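For reference, these are the two things I checked; the limits.conf entries below are example values matching what my file contained (the domain column may be a specific user name in your setup):

# what the shell reports as the per-user open-file limit
ulimit -n

# the relevant entries in /etc/security/limits.conf
# <domain>   <type>   <item>    <value>
*            soft     nofile    65535
*            hard     nofile    65535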
So why, when the user is allowed to open 65535 files, did the process hit a limit at 4095 files and start reporting the too many open files error?
Looking at the process's own details, I found that this particular process was capped at 4096 open files.
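The limit that actually applies to a process, regardless of what limits.conf says, can be read straight from /proc. This is how I would check it, again using the PID from my case:

# the limits in force for this specific process
grep "Max open files" /proc/20861/limits
# in my case this showed 4096 for both the soft and the hard limit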
At that point I rushed to search Baidu for the cause and found only one useful piece of information: the limits in /etc/security/limits.conf can effectively be overridden by /etc/security/limits.d/20-nproc.conf. In other words, even if /etc/security/limits.conf sets the maximum to 65535, if /etc/security/limits.d/20-nproc.conf says 4096, the user ends up able to open only 4096 file handles. So I hurried to check /etc/security/limits.d/20-nproc.conf, and sure enough it said 4096. I changed it to 65535, restarted the application, checked the process limits full of joy... and the result was still 4096. I was ready to spit blood...
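For completeness, this is roughly what that file contains on a stock CentOS 7 install, and the change I made. Note that, as the name suggests, its default entries are about nproc (number of processes) rather than nofile, which may be part of why changing it did nothing for me:

# /etc/security/limits.d/20-nproc.conf (CentOS 7 default, trimmed)
*          soft    nproc     4096
root       soft    nproc     unlimited

# what I changed the 4096 to
*          soft    nproc     65535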
After thinking about it for a long time and searching Baidu for a long time, I still had no clue, so as a stopgap I wrote a script that watched the log and restarted Tomcat whenever the too many open files message appeared (we run redundant instances and sessions are shared, so a restart does not affect the business).
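My actual script was a quick hack and not worth showing, but the idea was roughly the following; the log path and the tomcat unit name are assumptions, adjust them to your deployment:

#!/bin/bash
# crude watchdog: restart Tomcat when "too many open files" shows up in the log
LOG=/opt/tomcat/logs/catalina.out          # assumed log location
while true; do
    if tail -n 200 "$LOG" | grep -qi "too many open files"; then
        systemctl restart tomcat           # assumed unit name
        sleep 300                          # give it time to come back up
        # note: catalina.out is not truncated on restart, so a real script
        # should also track how far into the log it has already read
    fi
    sleep 30
done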
The next day I went back to the problem. My gut feeling was that the systemctl (systemd) unit script was at fault, but after reading it carefully I could find nothing wrong with it. So, with a "blind cat might still bump into a dead mouse" mentality, I went to see how other services' systemd scripts were written, and unbelievably, that is what solved the problem.
The one I happened to look at was the docker startup script (Docker had been installed with yum), and its unit file contained the following two lines. From my experience these are exactly what limit the number of files and processes the service can open:

LimitNOFILE=1048576
LimitNPROC=1048576

I added the same two lines to the Tomcat startup script (its systemd unit file), restarted the application, checked the process limits again, and the maximum number of open files had finally changed.
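If I understand it correctly, the reason limits.conf never helped is that services started by systemd do not go through the login (PAM) path, so /etc/security/limits.conf simply does not apply to them; only the Limit* directives in the unit file count. For anyone wanting to reproduce the fix, the steps are roughly these (the unit name tomcat.service and its path are assumptions, use whatever your installation has):

# add the limits to the [Service] section of the Tomcat unit file,
# e.g. /etc/systemd/system/tomcat.service (or a drop-in under tomcat.service.d/)
# [Service]
# LimitNOFILE=1048576
# LimitNPROC=1048576

# make systemd re-read the unit file, then restart the service
systemctl daemon-reload
systemctl restart tomcat

# verify the new limit actually took effect for the running process
PID=$(systemctl show -p MainPID tomcat | cut -d= -f2)
grep "Max open files" /proc/"$PID"/limits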
I am writing this post to help anyone else who runs into this problem, so you do not lose time failing to find the solution the way I did. If it helped you, please leave a comment: first, I would like to see how many people have hit this, since when I searched Baidu there was not a single article pointing at the per-process limit; and second, comments give me the motivation to keep writing and sharing. Thank you ~
CentOS 7: process max open files / too many open files error