To download an entire site with wget, you can use a command like this: wget -r -p -k -np http://hi.baidu.com/phps. Here -r means recursive download: wget follows and downloads every link it finds. Do not use this option on its own, though. If the site you are downloading links to other sites, wget will follow those links as well, and given how interconnected the web is, you could end up trying to download the entire Internet. So add -np, which stops wget from wandering outside the part of the site you started from. -k rewrites the links in the downloaded pages so that they point to the local copies, and -p fetches all the elements needed to display each page, such as images.
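The same command can also be written with long options, which are easier to read; this is just a restatement of the command above, using the author's example URL:

wget --recursive --page-requisites --convert-links --no-parent http://hi.baidu.com/phps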
There are also other parameters that can be used:
-c means resume an interrupted download (continue from the breakpoint)
-t 100 means retry up to 100 times; -t 0 means retry indefinitely
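For example, to fetch a large file over an unreliable connection, resuming partial downloads and retrying without limit (the URL here is a hypothetical placeholder):

wget -c -t 0 http://example.com/big-file.iso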
Alternatively, you can write the URLs you want to download into a file, one URL per line, and fetch them all with a command such as wget -i download.txt.
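A quick sketch, with hypothetical URLs. Suppose download.txt contains:

http://example.com/page1.html
http://example.com/page2.html

Then everything in the list is fetched with:

wget -i download.txt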
--reject=avi,rmvb tells wget not to download avi or rmvb files; --accept=jpg,jpeg tells it to download only jpg and jpeg files.
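For example, to recursively grab only the images from a site (hypothetical URL; note that during a recursive download wget still has to fetch the HTML pages to discover links, then deletes the ones that do not match the filter):

wget -r -np --accept=jpg,jpeg http://example.com/gallery/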
You can create a .wgetrc file in your home directory (Windows Explorer does not seem to let you create such a file directly, since it thinks the file has no name) and write http-proxy = 123.456.78.9:80 in it, then add the parameter --proxy=on to your command. If the proxy requires authentication, also add the parameters --proxy-user=username --proxy-passwd=password.
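A sketch of that setup, keeping the author's placeholder proxy address. Two caveats for current GNU wget versions (check wget --help on yours): the wgetrc setting is spelled http_proxy with an underscore, and the older --proxy=on / --proxy-passwd are now written -e use_proxy=on and --proxy-password.

~/.wgetrc:
http_proxy = http://123.456.78.9:80/
use_proxy = on

wget --proxy-user=username --proxy-password=password http://example.com/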
To close with a more practical example of downloading an entire site with wget: the same technique can grab a complete documentation set, such as the Smarty documentation.
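A minimal sketch, assuming the Smarty manual lives under https://www.smarty.net/docs/en/ (check the real URL before running this):

wget -r -p -k -np https://www.smarty.net/docs/en/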