Linux shell: delete duplicate files, retaining only one copy of each
Source: Internet
Author: User
#!/bin/bash
# Name: remove_one.sh
# Purpose: find and delete duplicate files. only one sample is retained for each file.
# Sort and list files by size (largest first)
ls -lS | awk 'BEGIN {
  # Discard the "total" header line, then read the first file entry
  getline; getline;
  name1 = $9; size = $5;
}
{
  name2 = $9;
  if (size == $5)
  # Files of the same size may have the same content
  {
    # Compare md5 checksums
    ("md5sum " name1) | getline; csum1 = $1;
    ("md5sum " name2) | getline; csum2 = $1;
    # If the checksums match, the files have identical content; print both names
    if (csum1 == csum2)
    {
      print name1; print name2
    }
  };
  size = $5; name1 = name2;
}' | sort -u > duplicate_files
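The pipeline above relies on awk's command form of `getline`, which runs an external command and reads its output into the current record, so `$1` becomes the checksum. A minimal standalone sketch (the `/tmp/demo_*` file names are throwaway examples, not part of the script):

```shell
# Toy demo of awk's command getline, as used above: run md5sum from
# inside awk and capture the first field (the checksum) of its output.
printf 'hello\n' > /tmp/demo_a
printf 'hello\n' > /tmp/demo_b
awk 'BEGIN {
  ("md5sum /tmp/demo_a") | getline; csum1 = $1;
  ("md5sum /tmp/demo_b") | getline; csum2 = $1;
  # Identical content yields identical checksums
  if (csum1 == csum2) print "same content"
}'
rm -f /tmp/demo_a /tmp/demo_b
```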
# Compute the md5sum of each duplicate file and write one sample per checksum to duplicate_sample
cat duplicate_files | xargs -I {} md5sum {} | sort | uniq -w 32 | awk '{ print $2 }' | sort -u > duplicate_sample
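The `uniq -w 32` step works because an md5 checksum is exactly 32 hex characters: comparing only the first 32 characters of each sorted line groups files by content and keeps the first line of each group as the sample. A toy demonstration with hand-written checksum lines (the file names are hypothetical):

```shell
# uniq -w 32 collapses lines that share their first 32 characters (the
# checksum), keeping one sample per group; awk then extracts the name.
printf '%s\n' \
  'd41d8cd98f00b204e9800998ecf8427e  a.txt' \
  'd41d8cd98f00b204e9800998ecf8427e  b.txt' \
  'ffffffffffffffffffffffffffffffff  c.txt' | uniq -w 32 | awk '{ print $2 }'
# Prints a.txt and c.txt: one sample from each checksum group
```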
echo Removing...
# Delete all files that are listed in duplicate_files but not in duplicate_sample
comm -2 -3 duplicate_files duplicate_sample | tee /dev/stderr | xargs rm
echo "Removed duplicate files successfully."
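The deletion step hinges on `comm -2 -3`, which, given two sorted inputs, suppresses lines unique to the second file (-2) and lines common to both (-3), leaving only lines unique to the first: the duplicates that are not the retained sample. A toy run with hypothetical file names:

```shell
# Both inputs to comm must be sorted (here they are, alphabetically).
printf '%s\n' a.txt b.txt c.txt > /tmp/dup_demo     # every duplicate file
printf '%s\n' a.txt c.txt > /tmp/sample_demo        # one sample per content group
comm -2 -3 /tmp/dup_demo /tmp/sample_demo           # prints b.txt: safe to delete
rm -f /tmp/dup_demo /tmp/sample_demo
```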
--------------------------------------------------------
Run:
[root@node1 tmp]# sh remove_one.sh
Note: the script only processes files in the current directory; subdirectories are not handled recursively.