Business requirement: migrate more than 10 million files from a directory to a remote machine.
Idea: use wget to fetch the files one by one. Because the number of files is so large, a single sequential loop would be very slow, so the work is split into chunks and processed in batches.
#!/bin/sh
home=/usr/local/www/skate/image63delback
cd $home
if [ "`pwd`" = "$home" ]; then
    # Start offsets of ten one-million-line chunks of the file list passed as $1
    a="1 1000000 2000000 3000000 4000000 5000000 6000000 7000000 8000000 9000000"
    for b in $a
    do
        # End of this one-million-line chunk
        c=`expr $b + 1000000`
        # Pull this chunk of relative paths out of the list file $1
        for loop in `sed -n "${b},${c}p" $1`
        do
            # Rebuild the first four path components locally,
            # then fetch the file into that directory
            path=`echo $loop | awk -F "/" '{print $1"/"$2"/"$3"/"$4}'`
            mkdir -p $path
            /usr/bin/wget http://172.16.111.163/$loop -P $path
            echo $loop >> $1.log
        done
    done
fi
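The script above expects $1 to be a text file listing one relative file path per line. The original post does not show how that list is produced; a minimal sketch, assuming the paths are relative to the web root on the source machine (the directory layout and file names below are made up for the demo):

```shell
#!/bin/sh
# Sketch (assumption): build the relative-path list the migration script reads via $1.
tmp=$(mktemp -d)
mkdir -p "$tmp/img/2016/05/s100"
touch "$tmp/img/2016/05/s100/a.jpg" "$tmp/img/2016/05/s100/b.jpg"

cd "$tmp"
# Strip the leading "./" so each line is a clean relative path such as
# img/2016/05/s100/a.jpg -- four directory components plus the file name,
# matching the awk '{print $1"/"$2"/"$3"/"$4}' reconstruction in the script.
find . -type f | sed 's|^\./||' | sort > filelist.txt
cat filelist.txt
```

The resulting list would then be fed to the migration script as its first argument (for example `sh migrate.sh filelist.txt`, script name assumed).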
Requirement 2: migrate the more than 10 million small files in directory A to directory B, 1000 at a time, one batch every 30 minutes; before each mv, confirm that directory B is empty.
Idea: use Python's shutil module together with the shell; the approach is roughly the same as above.
# -*- coding: utf-8 -*-
import os
import shutil

def test1(rootDir):
    # The destination must be empty before each batch run (requirement 2),
    # so check it once up front before walking the source tree
    if os.listdir("/data/file/s10032666/"):
        print "Directory is not empty, please empty it first."
        return 0
    count = 0
    for root, dirs, files in os.walk(rootDir):
        for f in files:
            # Move at most 1000 files per run
            if count >= 1000:
                return 0
            count = count + 1
            src = os.path.join(root, f)
            shutil.move(src, "/data/file/s10032666/")
            print src

test1("/data/S10032666_bak/")
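Requirement 2 also calls for one 1000-file batch every 30 minutes, which the Python code itself does not schedule. The same empty-destination check and capped batch move can be sketched in shell; the temp directories and the batch size of 3 below are stand-ins for the post's real paths and its limit of 1000:

```shell
#!/bin/sh
# Sketch of one batch run: refuse to move anything if the destination
# is not empty, otherwise move at most $batch files from src to dst.
src=$(mktemp -d)
dst=$(mktemp -d)
for i in 1 2 3 4 5; do touch "$src/f$i"; done

batch=3   # the post uses 1000
if [ -n "$(ls -A "$dst")" ]; then
    echo "Directory is not empty, please empty it first."
else
    ls "$src" | head -n "$batch" | while read f; do
        mv "$src/$f" "$dst/"
    done
fi
```

Running one batch every 30 minutes would typically be done with cron, e.g. a crontab entry like `*/30 * * * * /usr/bin/python /path/to/move_script.py` (the script path is an assumption, not from the post).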
This article comes from the "creator think" blog; please keep the source link when reposting: http://strongit.blog.51cto.com/10020534/1812396
Shell quickly migrates massive files