Download an entire website recursively using wget (essential for webmasters)
If you come across a beautifully designed website and want to save a copy to study it, here is a commonly used shell command: wget.
This command can download an entire site recursively and convert the links on the downloaded pages into local links.
With the right parameters, wget becomes a powerful download tool.
wget command details
wget -r -p -np -k http://xxx.com/xxx
-r, --recursive: specify recursive download.
-k, --convert-links: make links in downloaded HTML point to local files (i.e. convert the links on downloaded pages into relative, local links).
-p, --page-requisites: get all images and other resources needed to display the HTML page.
-np, --no-parent: do not ascend to the parent directory when downloading recursively.
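For readability, the same download can also be written with the long option names listed above; the URL below is just a placeholder:
wget --recursive --page-requisites --no-parent --convert-links http://www.example.com/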
In addition, the -nc parameter lets you pick up an interrupted download without re-fetching files that already exist, and the -o parameter writes wget's output to a log file.
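As a rough sketch of how these fit together, an interrupted mirror could be restarted with something like the following; the URL and the log file name download.log are placeholders, not part of the original command:
wget -nc -r -p -np -k -o download.log http://www.example.com/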
Let's try it on my own website.
Run the command: wget -r -p -np -k http://www.phpernote.com/
When the recursive download finishes, you will find a new directory named www.phpernote.com/ in your current directory.
Go into that directory and have a look.
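To take a quick look, something like the following should work (assuming the download was run from the current directory; the exact file names depend on the site):
cd www.phpernote.com
ls
Opening the saved index.html in a browser should show the page with its links rewritten to point at the local copies, thanks to -k.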
Getting familiar with the wget command can be a great help in your own website work.