wget commands in Linux

wget is a command-line tool for downloading files on Linux systems. It is an essential tool for Linux users: we often need to download software, or restore a backup from a remote server to a local one.
wget supports the HTTP, HTTPS, and FTP protocols, and can work through HTTP proxies. "Automatic download" means that wget can keep running in the background after the user logs out of the system. This means you can log in to the system, start a wget download task,
then log out, and wget will keep running in the background until the task completes. Most other browsers, by contrast, require the user to stay involved while downloading large amounts of data, so this saves a lot of hassle. wget can also follow the links on an HTML page and download them to create a local copy of the remote server, completely rebuilding the directory structure of the original site. This is often referred to as a "recursive download".
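As a minimal sketch of such a recursive download (example.com is a placeholder URL; -l 2 limits the recursion depth and -k rewrites the links for offline browsing):

wget -r -l 2 -k http://example.com/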
During a recursive download, wget honors the Robots Exclusion Standard (/robots.txt). While downloading, wget can also rewrite links to point at the local files, for offline browsing. wget is very stable, and it adapts well to narrow bandwidth and unstable networks. If a download fails because of a network problem, wget will keep retrying until the entire file has been retrieved.
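A minimal sketch of such a persistent download (the URL is a placeholder; -t 0 asks for unlimited retries and -c resumes each attempt from where the previous one stopped):

wget -t 0 -c http://example.com/big-file.iso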
If the server interrupts the download, wget will reconnect and continue downloading from where it stopped. This is useful for downloading large files from servers that limit connection time.

1. Command format:
wget [parameters] [URL]

2. Command function:
Downloads resources from the network. If no directory is specified, resources are saved to the current directory.

Although wget is powerful, it is still fairly simple to use:
1) It supports resuming interrupted downloads. This was the biggest selling point of NetAnts and FlashGet back in the day; wget offers the same feature, so users with poor network connections can rest assured.
2) It supports both FTP and HTTP downloads. Although most software can now be fetched over HTTP, in some cases you still need FTP to download it.
3) It supports proxy servers. For security, high-value systems are generally not exposed directly to the Internet, so proxy support is a necessary feature.
4) It is easy to configure. Users accustomed to a GUI may not be used to the command line, but for configuration the command line actually has advantages: at the very least, far fewer mouse clicks, and no worrying about misclicks.
5) It is small and completely free. The small size hardly matters now that disks are so large, but "completely free" still does: many so-called free programs carry advertisements that nobody wants.

3. Command parameters:

Startup parameters:
-V, --version  show the wget version and exit
-h, --help  print syntax help
-b, --background  go to background after startup
-e, --execute=COMMAND  execute a command in .wgetrc format (for the format, see /etc/wgetrc or ~/.wgetrc)

Logging and input-file parameters:
-o, --output-file=FILE  write log messages to FILE
-a, --append-output=FILE  append log messages to FILE
-d, --debug  print debug output
-q, --quiet  quiet mode (no output)
-v, --verbose  verbose mode (the default)
-nv, --non-verbose  turn off verbose mode, but not quiet mode
-i, --input-file=FILE  download the URLs listed in FILE
-F, --force-html  treat the input file as HTML
-B, --base=URL  resolve relative links in the file given by -F -i against URL
--sslcertfile=FILE  optional client certificate
--sslcertkey=KEYFILE  optional client certificate key file
--egd-file=FILE  file name of the EGD socket

Download parameters:
--bind-address=ADDRESS  bind to ADDRESS (hostname or IP) on the local machine (useful when the host has several IPs or names)
-t, --tries=NUMBER  maximum number of retries (0 means unlimited)
-O, --output-document=FILE  write documents to FILE
-nc, --no-clobber  do not overwrite existing files or create numbered copies
-c, --continue  resume getting a partially downloaded file
--progress=TYPE  select the progress-bar style
-N, --timestamping  do not re-download a file unless it is newer than the local copy
-S, --server-response  print the server's response
--spider  do not download anything
-T, --timeout=SECONDS  set the response timeout in seconds
-w, --wait=SECONDS  wait SECONDS between retrievals
--waitretry=SECONDS  wait 1...SECONDS between retries of a retrieval
--random-wait  wait 0...2*WAIT seconds between retrievals
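As a rough illustration of how these logging and download options combine (urls.txt and fetch.log are hypothetical file names), the following logs to a file, reads URLs from a list, caps the download rate, waits between retrievals, and retries each URL up to 5 times:

wget -o fetch.log -i urls.txt --limit-rate=200k -w 2 -t 5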
-Y, --proxy=on/off  turn the proxy on or off
-Q, --quota=NUMBER  set a total download quota
--limit-rate=RATE  limit the download rate to RATE

Directory parameters:
-nd, --no-directories  do not create directories
-x, --force-directories  force creation of directories
-nH, --no-host-directories  do not create host-named directories
-P, --directory-prefix=PREFIX  save files to PREFIX/...
--cut-dirs=NUMBER  ignore NUMBER remote directory components

HTTP option parameters:
--http-user=USER  set the HTTP user name to USER
--http-passwd=PASS  set the HTTP password to PASS
-C, --cache=on/off  allow or disallow server-side data caching (normally allowed)
-E, --html-extension  save all text/html documents with an .html extension
--ignore-length  ignore the Content-Length header field
--header=STRING  insert STRING among the request headers
--proxy-user=USER  set the proxy user name to USER
--proxy-passwd=PASS  set the proxy password to PASS
--referer=URL  include a "Referer: URL" header in the HTTP request
-s, --save-headers  save the HTTP headers to the file
-U, --user-agent=AGENT  identify as AGENT instead of Wget/VERSION
--no-http-keep-alive  disable HTTP keep-alive (persistent connections)
--cookies=off  do not use cookies
--load-cookies=FILE  load cookies from FILE before the session starts
--save-cookies=FILE  save cookies to FILE after the session ends

FTP option parameters:
-nr, --dont-remove-listing  do not remove the .listing files
-g, --glob=on/off  turn file-name globbing on or off
--passive-ftp  use passive transfer mode (the default)
--active-ftp  use active transfer mode
--retr-symlinks  when recursing, retrieve the files symlinks point to (not directories)

Recursive download parameters:
-r, --recursive  recursive download (use with care!)
-l, --level=NUMBER  maximum recursion depth (inf or 0 for infinite)
--delete-after  delete files locally after downloading them
-k, --convert-links  convert non-relative links to relative ones
-K, --backup-converted  before converting file X, back it up as X.orig
-m, --mirror  equivalent to -r -N -l inf -nr
-p, --page-requisites  download all images and other files needed to display the HTML pages

Recursive accept/reject parameters:
-A, --accept=LIST  comma-separated list of accepted extensions
-R, --reject=LIST  comma-separated list of rejected extensions
-D, --domains=LIST  comma-separated list of accepted domains
--exclude-domains=LIST  comma-separated list of rejected domains
--follow-ftp  follow FTP links found in HTML documents
--follow-tags=LIST  comma-separated list of HTML tags to follow
-G, --ignore-tags=LIST  comma-separated list of HTML tags to ignore
-H, --span-hosts  go to foreign hosts when recursing
-L, --relative  follow relative links only
-I, --include-directories=LIST  list of allowed directories
-X, --exclude-directories=LIST  list of excluded directories
-np, --no-parent  do not ascend to the parent directory

wget -S --spider URL  do not download anything, just show the process

4. Usage examples:

Example 1: download a single file with wget
Command: wget http://www.minjieren.com/wordpress-3.1-zh_CN.zip
Description: This downloads a single file from the network and saves it in the current directory. During the download a progress bar is shown, including the completed percentage, the bytes downloaded so far, the current download speed, and the remaining download time.
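Building on Example 1, a small hedged variation (the ./downloads directory is a placeholder) that uses the directory and clobber options from section 3 to save the file into a chosen directory without overwriting an existing copy:

wget -nc -P ./downloads http://www.minjieren.com/wordpress-3.1-zh_CN.zip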
Example 2: download with wget -O and save under a different file name
Command: wget -O wordpress.zip http://www.minjieren.com/download.aspx?id=1080
Description: By default, wget names the file after whatever follows the last "/" in the URL, which is usually wrong for dynamically generated download links.
Wrong: the following command downloads the file but saves it under the name download.aspx?id=1080:
wget http://www.minjieren.com/download.aspx?id=1080
Even though the downloaded file is in zip format, it is still saved as download.aspx?id=1080.
Correct: to solve this problem, use the -O parameter to specify a file name:
wget -O wordpress.zip http://www.minjieren.com/download.aspx?id=1080

Example 3: rate-limited download with wget --limit-rate
Command: wget --limit-rate=300k http://www.minjieren.com/wordpress-3.1-zh_CN.zip
Description: By default, wget uses all the bandwidth it can get. When you are about to download a large file and still need to download other things, it is worth limiting the speed.

Example 4: resume an interrupted download with wget -c
Command: wget -c http://www.minjieren.com/wordpress-3.1-zh_CN.zip
Description: wget -c restarts the download of an interrupted file. This is very helpful when a large download is cut off by a network problem: instead of downloading the whole file again, we can continue from where it stopped. Use the -c parameter whenever you need to resume an interrupted download.

Example 5: background download with wget -b
Command: wget -b http://www.minjieren.com/wordpress-3.1-zh_CN.zip
Description: For very large files, we can use the -b parameter to download in the background.
wget -b http://www.minjieren.com/wordpress-3.1-zh_CN.zip
Continuing in background, pid 1840.
Output will be written to 'wget-log'.
You can check the download progress with the following command:
tail -f wget-log

Example 6: download with a masqueraded user agent
Command: wget --user-agent="Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.204 Safari/534.16" http://www.minjieren.com/wordpress-3.1-zh_CN.zip
Description: Some websites reject your download request when the user-agent name tells them the client is not a browser. You can disguise the client with the --user-agent parameter.

Example 7: test a download link with wget --spider
Command: wget --spider URL
Description: When you plan a scheduled download, you should test whether the download link works at the scheduled time. We can add the --spider parameter to check.
If the download link is correct, wget --spider URL will display:
Spider mode enabled. Check if remote file exists.
HTTP request sent, awaiting response ... 200 OK
Length: unspecified [text/html]
Remote file exists and could contain further links, but recursion is disabled -- not retrieving.
This ensures the download can take place at the scheduled time. But if you give a wrong link, the following error is displayed:
wget --spider URL
Spider mode enabled. Check if remote file exists.
HTTP request sent, awaiting response ... 404 Not Found
Remote file does not exist -- broken link!!!
You can use the --spider parameter in the following cases:
check a link before a scheduled download
check at intervals whether a site is available
check a site's pages for dead links

Example 8: increase the number of retries with wget --tries
Command: wget --tries=40 URL
Description: A download can also fail if the network has problems or the file is very large. wget retries 20 times by default to download a file. If necessary, you can use --tries to increase the number of retries.
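For the scheduled-download scenario from Example 7, a hedged crontab sketch (the schedule, URL, and directory are all placeholders) that quietly fetches a file every night at 3:00 with resume enabled:

0 3 * * * wget -q -c -P /tmp http://example.com/nightly.tar.gz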
Example 9: download multiple files with wget -i
Command: wget -i filelist.txt
Description: First, save the download links to a file:
cat > filelist.txt
url1
url2
url3
url4
Then use this file with the -i parameter.

Example 10: mirror a website with wget --mirror
Command: wget --mirror -p --convert-links -P ./local URL
Description: Downloads the entire site to a local directory.
--mirror: turn on mirroring
-p: download all files needed to display the HTML pages properly
--convert-links: after the download, convert the links for local viewing
-P ./local: save all files and directories under the specified local directory

Example 11: filter out a given format with wget --reject
Command: wget --reject=gif URL
Description: Use this command when you want to download a website but do not want to download its images, for example.

Example 12: save the download log to a file with wget -o
Command: wget -o download.log URL
Description: Use this when you do not want the download messages shown on the terminal but written to a log file instead.

Example 13: limit the total download size with wget -Q
Command: wget -Q5m -i filelist.txt
Description: Use this when you want wget to quit once the total download exceeds 5 MB. Note: this parameter has no effect on a single-file download; it only takes effect for recursive downloads or downloads from a URL list.

Example 14: download files of a given format with wget -r -A
Command: wget -r -A.pdf URL
Description: You can use this feature to, for example:
download all the images from a website
download all the videos from a website
download all the PDF files from a website

Example 15: FTP download with wget
Commands:
wget FTP-URL
wget --ftp-user=USERNAME --ftp-password=PASSWORD URL
Description: You can use wget to download from FTP links.
Anonymous FTP download:
wget FTP-URL
FTP download with username and password authentication:
wget --ftp-user=USERNAME --ftp-password=PASSWORD URL

Note: to compile and install wget from source, use the following commands:
# tar zxvf wget-1.9.1.tar.gz
# cd wget-1.9.1
# ./configure
# make
# make install
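Putting several of these examples together, a hedged one-liner for a robust unattended download of a large file (the URL is a placeholder) that goes to the background, resumes on interruption, retries without limit, and logs to download.log:

wget -b -c -t 0 -o download.log http://example.com/big.iso

Progress can then be followed with tail -f download.log.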