The CentOS disk is full: find large files and clean up


July 29, 2013

Today I found that running crontab -e on the VPS produced "Disk quota exceeded" and the crontab could not be edited. Running df -h showed the system disk at 100% usage. The culprit turned out to be /var/spool/mail/root, which had grown to several GB; cleaning it up solved the problem. If you don't know which files are taking up the disk space, use the method below to find large files.
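For illustration, the check and the quick fix look roughly like this (the device name, sizes, and dates are made up for the example):

# df -h
Filesystem      Size  Used Avail Use% Mounted on
/dev/vda1        20G   20G     0 100% /
# ls -lh /var/spool/mail/root
-rw------- 1 root mail 8.3G Jul 29 09:55 /var/spool/mail/root
# echo > /var/spool/mail/root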

The following is reposted from HTTP://HI.BAIDU.COM/11HAIOU/ITEM/F3A4878B601E084E840FAB17

Linux systems often run out of disk space. When that happens, first find out whether the cause is a file that is too large or a directory with too many files; the deeper root cause can be investigated later. To see which paths hold the large files, Linux provides the du command.

First, the command: du -m --max-depth=1 or du -h --max-depth=1

du: reports the disk space used by files or directories on Linux. Relevant parameters:
-m: display the results in megabytes
-h: display the results in K, M, or G units for better readability
--max-depth=1: the number "1" is the maximum number of directory levels shown in the results; here only one level of subdirectories is displayed. An example is shown below:
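Running the command against /var, for instance, might produce output like this (the sizes are illustrative only):

# du -h --max-depth=1 /var
1.2G    /var/spool
350M    /var/log
120M    /var/cache
24M     /var/lib
1.7G    /var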

Then use this command to drill down level by level (directory by directory) until you reach the directory containing the large files. Keep in mind that heavy usage can also be caused by a directory containing too many files. Once a large file is found, if for some reason you don't want to delete it, you can empty its contents with echo, cat, or clear:
echo > /var/log/big.log    This empties the contents of the big.log file in the /var/log directory without deleting big.log itself.
cat > /var/log/big.log     This works the same as echo > /var/log/big.log, but after the command runs you must press Ctrl+D to finish.
clear > /var/log/big.log   This also empties the contents of the big.log file without deleting the file.
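As a sketch of how this looks in practice (the file name, sizes, and timestamps are just an illustration):

# ls -lh /var/log/big.log
-rw-r--r-- 1 root root 3.2G Jul 29 10:00 /var/log/big.log
# echo > /var/log/big.log
# ls -lh /var/log/big.log
-rw-r--r-- 1 root root 1 Jul 29 10:01 /var/log/big.log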

==========================================================

Another possibility: the inodes may be exhausted.


Use # df -i to check.
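The output looks roughly like this (the numbers are illustrative); when IUse% reaches 100%, no new files can be created even if df -h still shows free space:

# df -i
Filesystem      Inodes   IUsed  IFree IUse% Mounted on
/dev/vda1      1310720 1310720      0  100% /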

If the inodes really have run out, the disk can no longer accept new files. This is usually caused by a huge number of small temporary files using up all the inodes; deleting them solves the problem.

# find /var/spool/clientmqueue -type f -print -exec rm -f {} \;
In this command, /var/spool/clientmqueue is the directory whose files are to be deleted. The -type f option restricts find to regular files only (not pipes, block devices, symlinks, directories, and so on), which prevents accidentally deleting system files.
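If your find is GNU find (the default on CentOS), the built-in -delete action does the same job and avoids launching rm once per file:

# find /var/spool/clientmqueue -type f -delete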

Cause analysis: a user on the system has cron jobs that produce output, and cron mails that output to the user. If sendmail is not running, the messages cannot be delivered and pile up as files in /var/spool/clientmqueue.
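One common way to keep the queue from filling up again is to discard the cron job's output by redirecting it to /dev/null; the schedule and script path below are just placeholders:

*/10 * * * * /path/to/your-script.sh > /dev/null 2>&1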
