On the Mac I could not find a suitable tool for downloading a site's pages to use as web page templates, so after some research I wrote a script that downloads pages via wget, as follows:
#!/bin/bash
# Usage: ./test.sh <save-dir> <url>
URL="$2"
SAVE_DIR="$1"   # named SAVE_DIR rather than PATH so the shell's PATH is not clobbered
echo "Download URL: $URL"
echo "Download dir: $SAVE_DIR"
/usr/local/bin/wget -e robots=off -w 1 -xq -np -pk -E -t 1 -P "$SAVE_DIR" "$URL"
echo "Success to download"
Note:
robots=off is needed here because by default wget obeys the site's robots.txt; if robots.txt contains "User-agent: * Disallow: /", wget cannot mirror or download the directory. The -e robots=off option bypasses this restriction.
-w <seconds>: wait time between requests for resources (eases server pressure)
-np: download only the content under the given URL, not its parent content
-p -k: download all the resources the page needs, including images and CSS styles, and convert absolute paths to relative paths (this is important so that the related resources are found locally when the user opens the page)
-E: save downloaded HTML files with an .html suffix
-t <times>: number of retries after a resource download fails
-P <path>: directory to download into
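The packed short flags above can be hard to read. As a sketch, here is the same wget invocation rewritten with long options inside a small shell function; the function name mirror_page and the DRY_RUN switch are my own additions for illustration, not part of the original script.

```shell
#!/bin/bash
# Sketch: the same download expressed with wget's long options.
# mirror_page <save-dir> <url> builds and runs the wget command;
# with DRY_RUN=1 it only prints the command instead of running it.
mirror_page() {
  local save_dir="$1" url="$2"
  # Long-option equivalents of: -e robots=off -w 1 -xq -np -pk -E -t 1 -P
  local cmd=(wget
    --execute robots=off      # ignore robots.txt restrictions
    --wait=1                  # pause 1s between requests (eases server load)
    --force-directories       # -x: recreate the site's directory layout
    --quiet                   # -q: suppress progress output
    --no-parent               # -np: never ascend above the given URL
    --page-requisites         # -p: fetch images, CSS, etc. the page needs
    --convert-links           # -k: rewrite links so the page works locally
    --adjust-extension        # -E: save HTML pages with an .html suffix
    --tries=1                 # -t 1: one attempt per resource
    --directory-prefix="$save_dir"
    "$url")
  if [ "${DRY_RUN:-0}" = "1" ]; then
    echo "${cmd[@]}"          # show the command instead of running it
  else
    "${cmd[@]}" && echo "Success to download"
  fi
}
```

Keeping the options in an array also makes it easy to add or drop flags without editing one long command line.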
Save the above content as a *.sh file, make it executable with chmod +x,
then execute the file with the arguments passed in, for example:
"/users/zhangtao/documents/shell Script/test.sh" /users/zhangtao/documents/sites http://kedahb.com/index.asp
This command executes the test.sh file, passing in the save directory and the URL of the page to download; when it completes, it prints "Success to download".
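The chmod-then-run workflow above can be sketched end to end as follows; the temporary directory and the example URL are my stand-ins, not paths from the original post.

```shell
#!/bin/bash
# Sketch: save a script, grant execute permission, run it with arguments.
# The script body below is a stub that only echoes; the real version
# would contain the wget call.
set -eu

workdir=$(mktemp -d)                 # stand-in for your Documents folder
cat > "$workdir/test.sh" <<'EOF'
#!/bin/bash
echo "Download dir: $1"
echo "Download URL: $2"
echo "Success to download"
EOF

chmod +x "$workdir/test.sh"          # grant execute permission

# Quote the script path if it contains spaces (e.g. "shell Script"):
out=$("$workdir/test.sh" "$workdir/sites" http://example.com/index.asp)
echo "$out"

rm -rf "$workdir"                    # clean up the sketch
```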
Results of the Download: