Delete a large number of files under CentOS
Create 500,000 files first:

for i in $(seq 1 500000); do echo text >> ${i}.txt; done
1. rm
Test: time rm -f *

zsh: sure you want to delete all the files in /home/hungerr/test [yn]? y
zsh: argument list too long: rm
rm -f *  3.63s user 0.29s system 98% cpu 3.985 total

rm does not work because there are too many files.
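A note on why this fails: the glob itself is fine; it is the expanded argument list handed to the rm binary that exceeds the kernel's ARG_MAX limit on execve() (check it with `getconf ARG_MAX`). One common workaround, sketched below on a small temporary directory rather than the full 500,000-file set: printf is a shell builtin, so it is not subject to ARG_MAX, and xargs feeds rm the names in safe batches.

```shell
# rm -f * fails because the shell-expanded glob exceeds ARG_MAX.
# printf is a builtin (no execve), so the glob expansion is safe here;
# xargs -0 then invokes rm in ARG_MAX-sized batches.
tmp1=$(mktemp -d)
for i in $(seq 1 1000); do : > "$tmp1/$i.txt"; done   # small demo set
(cd "$tmp1" && printf '%s\0' *.txt | xargs -0 rm -f)
find "$tmp1" -type f | wc -l
```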
2. find
Test: time find ./ -type f -exec rm {} \;

find ./ -type f -exec rm {} \;  49.86s user 1032.13s system 41% cpu 43:19.17 total

About 43 minutes on my computer... you can go watch a video while it deletes.
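The cost here is process creation: `-exec rm {} \;` forks and execs one rm per file, so 500,000 files means 500,000 rm processes. POSIX find also accepts the `+` terminator, which batches many paths into each rm invocation, much like xargs. A small sketch:

```shell
# Terminating -exec with + appends as many paths as fit into each rm
# call, so the number of forked rm processes drops from N to a handful.
tmp2=$(mktemp -d)
for i in $(seq 1 1000); do : > "$tmp2/$i.txt"; done
find "$tmp2" -type f -exec rm {} +
find "$tmp2" -type f | wc -l
```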
3. find with -delete
Test: time find ./ -type f -delete

find ./ -type f -delete  0.43s user 11.21s system 2% cpu …38 total

It takes 9 minutes.
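-delete removes each file inside find itself, one direct unlink() per file with no child process at all, which is why it beats -exec. Two caveats worth knowing: -delete must come after the tests it applies to, and it is irreversible, so running the same expression with -print first is a sensible dry run. A sketch:

```shell
tmp3=$(mktemp -d)
for i in $(seq 1 1000); do : > "$tmp3/$i.txt"; done
find "$tmp3" -type f -print > /dev/null   # dry run: same matches, no deletion
find "$tmp3" -type f -delete              # unlinks in-process, no fork/exec
find "$tmp3" -type f | wc -l
```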
4. rsync
First, create an empty folder named blanktest.

➜ ~ time rsync -a --delete blanktest/ test/
rsync -a --delete blanktest/ test/  0.59s user 7.86s system 51% cpu 16.418 total

16 seconds, very fast.
5. Python
import os
import timeit

def main():
    for pathname, dirnames, filenames in os.walk('/home/username/test'):
        for filename in filenames:
            file = os.path.join(pathname, filename)
            os.remove(file)

if __name__ == '__main__':
    t = timeit.Timer('main()', 'from __main__ import main')
    print t.timeit(1)
➜ ~ python test.py
529.309022903

It takes about 9 minutes.
6. Perl
Test: time perl -e 'for(<*>){((stat)[9]<(unlink))}'

perl -e 'for(<*>){((stat)[9]<(unlink))}'  1.28s user 7.23s system 50% cpu 16s total

16 s; this should be the fastest.
Statistics:

Command               Time
rm                    unavailable (too many files)
find with -exec       43 minutes (500,000 files)
find with -delete     9 minutes
perl                  16 s
python                9 minutes
rsync with --delete   16 s