Using wget on CentOS

Source: Internet
Author: User

  • Abstract: wget is a free tool for downloading files from the network automatically. It supports the HTTP, HTTPS, and FTP protocols and can work through an HTTP proxy. "Automatic" means that wget can keep running in the background after the user logs out: you can log in, start a wget download task, and log out, and wget keeps working until the task is complete. Compared with most browsers, which need the user's attention while downloading large amounts of data, this saves a lot of trouble.
  • Tags: centos, wget

    This article explains how to use wget on CentOS, along with some background on the wget package. wget is a command-line download tool that almost every Linux user touches daily. The following tips should help you use wget more efficiently and flexibly.

    Tips for using wget

    $ wget -r -np -nd http://example.com/packages/

    This command downloads all files in the packages directory on http://example.com. -np keeps wget from traversing the parent directory, and -nd tells it not to recreate the directory structure locally.

    $ wget -r -np -nd --accept=iso http://example.com/centos-5/i386/

    Similar to the previous command, but with an added --accept=iso option, which tells wget to download only the files with the .iso extension in the i386 directory. You can also specify several extensions, separated by commas, as shown below.
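
    For example, a minimal one-liner accepting two extensions at once (the URL is the same illustrative one):

    $ wget -r -np -nd -A iso,img http://example.com/centos-5/i386/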

    $ wget -i filename.txt

    This command is often used for batch downloads: put the addresses of all the files you want into filename.txt, and wget downloads every one of them for you.
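
    As a minimal sketch (the URLs are placeholders), filename.txt is just one address per line:

    $ cat filename.txt
    http://example.com/downloads/file1.iso
    ftp://example.com/pub/file2.tar.gz
    $ wget -i filename.txt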

    $ wget -c http://example.com/really-big-file.iso

    The -c option shown here resumes an interrupted download instead of starting over.

    $ wget -m -k (-H) http://www.example.com/

    This command mirrors a website, and wget converts the links for offline browsing. If images on the site are hosted on another server, add the -H option.
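
    A hedged variant for a site whose images live on a separate host (both host names are illustrative); restricting -H with -D keeps the crawl from wandering across the whole web:

    $ wget -m -k -H -D www.example.com,images.example.com http://www.example.com/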

    wget user guide

    wget is a free tool for downloading files from the network automatically. It supports the HTTP, HTTPS, and FTP protocols and can work through an HTTP proxy. "Automatic" means that wget can keep running in the background after the user logs out: you can log in, start a wget download task, and log out, and wget keeps working until the task is complete. Compared with most browsers, which need the user's attention while downloading large amounts of data, this saves a lot of trouble.
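
    A minimal sketch of that workflow (the URL is illustrative): start the download in the background with its own log file, log out, and check progress later.

    $ wget -b -o down.log http://example.com/really-big-file.iso
    $ tail -f down.log    # watch the progress; it is safe to log out at any time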

    wget can follow the links on an HTML page and download them to create a local version of the remote server, completely recreating the directory structure of the original site. This is often called "recursive downloading". While downloading recursively, wget complies with the Robot Exclusion Standard (/robots.txt). wget can also convert the links in downloaded pages to point at the local copies, to make offline browsing easier.
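
    For instance, a hedged sketch of a shallow recursive download (placeholder URL): two levels deep, links rewritten for offline browsing, never ascending above the starting directory.

    $ wget -r -l 2 -k -np http://example.com/docs/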

    wget is very stable. It adapts well to unstable, narrow-bandwidth networks: if a download fails for network reasons, wget keeps retrying until the whole file has been fetched, and if the server interrupts the transfer, wget reconnects and continues from where it stopped. This is very useful for downloading large files from servers that limit connection time.
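
    A minimal "keep trying" sketch for a flaky link (illustrative URL): resume partial files, retry without limit, and back off between retries.

    $ wget -c -t 0 --waitretry=10 http://example.com/really-big-file.iso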

    Common usage of wget

    Usage: wget [OPTION]... [URL]...

    Mirror a site:

    $ wget -r -p -np -k http://dsec.pku.edu.cn/~usr_name/
    # or
    $ wget -m http://dsec.pku.edu.cn/~usr_name/

    Download a large file over an unstable network, retrying without limit and waiting 31 seconds between attempts, with the log written to down.log:

    $ wget -t 0 -w 31 -c http://dsec.pku.edu.cn/BBC.avi -o down.log &

    Or read the list of files to download from filelist.txt:

    $ wget -t 0 -w 31 -c -B ftp://dsec.pku.edu.cn/linuxsoft -i filelist.txt -o down.log &

    The commands above are also handy for downloading when the network is relatively idle. My own usage: in Mozilla I copy the URLs that are inconvenient to download right away, paste them into filelist.txt, and run the second command above before leaving the system for the night.

    Download through a proxy:

    $ wget -Y on -p -k http://www.example.com/

    The proxy can be set in an environment variable or in the wgetrc file:

    # Set the proxy in an environment variable
    export http_proxy=http://211.90.168.94:8080/

    # Or set the proxy in ~/.wgetrc:

    http_proxy = http://proxy.yoyodyne.com:18023/

    ftp_proxy = http://proxy.yoyodyne.com:18023/
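
    Alternatively, a hedged sketch using the shell environment instead of wgetrc (the addresses are the same illustrative ones), including switching the proxy off for a single run with -Y:

    $ export http_proxy=http://proxy.yoyodyne.com:18023/
    $ wget http://example.com/file.iso          # fetched through the proxy
    $ wget -Y off http://example.com/file.iso   # bypasses the proxy for this run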

    Categorized list of wget options

    Startup

    -V, --version            display the version of wget and exit
    -h, --help               print this help
    -b, --background         go to background after startup
    -e, --execute=COMMAND    execute a `.wgetrc'-style command (for the wgetrc format, see /etc/wgetrc or ~/.wgetrc)

    Logging and input file

    -o, --output-file=FILE       log messages to FILE
    -a, --append-output=FILE     append log messages to FILE
    -d, --debug                  print debug output
    -q, --quiet                  quiet mode (no output)
    -v, --verbose                verbose mode (this is the default)
    -nv, --non-verbose           turn off verbose mode, without being quiet
    -i, --input-file=FILE        download the URLs found in FILE
    -F, --force-html             treat the input file as HTML
    -B, --base=URL               use URL as the prefix for relative links in the file given by -F -i
    --sslcertfile=FILE           optional client certificate
    --sslcertkey=KEYFILE         optional keyfile for the client certificate
    --egd-file=FILE              file name of the EGD socket

    Download

    --bind-address=ADDRESS       bind to ADDRESS (host name or IP) on the local machine (useful when it has several addresses or names)
    -t, --tries=NUMBER           set the maximum number of connection attempts (0 means unlimited)
    -O, --output-document=FILE   write the documents to FILE
    -nc, --no-clobber            do not overwrite existing files or use .# suffixes
    -c, --continue               resume a partially downloaded file
    --progress=TYPE              select the progress-bar style
    -N, --timestamping           do not re-fetch files unless they are newer than the local copy
    -S, --server-response        print the server's responses
    --spider                     do not download anything
    -T, --timeout=SECONDS        set the read timeout to SECONDS
    -w, --wait=SECONDS           wait SECONDS between retrievals
    --waitretry=SECONDS          wait 1...SECONDS between retries of a retrieval
    --random-wait                wait 0...2*WAIT seconds between retrievals
    -Y, --proxy=on/off           turn the proxy on or off
    -Q, --quota=NUMBER           set the download quota to NUMBER
    --limit-rate=RATE            limit the download rate to RATE
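
    A hedged sketch combining several of the download options above (the file list and all limits are illustrative): three tries per URL, a 30-second timeout, five seconds between files, a 200 KB/s rate cap, and a roughly 500 MB overall quota.

    $ wget -t 3 -T 30 -w 5 --limit-rate=200k -Q 500m -i filelist.txt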

    Directories

    -nd, --no-directories            do not create directories
    -x, --force-directories          force creation of directories
    -nH, --no-host-directories       do not create host directories
    -P, --directory-prefix=PREFIX    save files to PREFIX/...
    --cut-dirs=NUMBER                ignore NUMBER remote directory components
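
    As a sketch of how the directory options combine (URL and prefix are illustrative): -nH drops the example.com/ directory and --cut-dirs=2 drops pub/linux/, so the files land under downloads/isos/.

    $ wget -r -np -nH --cut-dirs=2 -P downloads http://example.com/pub/linux/isos/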

    HTTP options

    --http-user=USER         set the HTTP user name to USER
    --http-passwd=PASS       set the HTTP password to PASS
    -C, --cache=on/off       allow or disallow server-side caching (normally allowed)
    -E, --html-extension     save all text/html documents with the .html extension
    --ignore-length          ignore the `Content-Length' header field
    --header=STRING          insert STRING among the request headers
    --proxy-user=USER        set the proxy user name to USER
    --proxy-passwd=PASS      set the proxy password to PASS
    --referer=URL            include a `Referer: URL' header in the HTTP request
    -s, --save-headers       save the HTTP headers to the file
    -U, --user-agent=AGENT   identify as AGENT instead of Wget/VERSION
    --no-http-keep-alive     disable HTTP keep-alive (persistent connections)
    --cookies=off            do not use cookies
    --load-cookies=FILE      load cookies from FILE before the session starts
    --save-cookies=FILE      save cookies to FILE after the session ends
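
    For example, a hedged sketch that reuses an exported cookie file and a custom identity (the file name, agent string, and URL are all illustrative):

    $ wget --load-cookies=cookies.txt -U "Mozilla/5.0" --header="Accept-Language: en" http://example.com/members/page.html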

    FTP options

    -nr, --dont-remove-listing   do not remove `.listing' files
    -g, --glob=on/off            turn file-name globbing on or off
    --passive-ftp                use passive transfer mode (the default)
    --active-ftp                 use active transfer mode
    --retr-symlinks              when recursing, retrieve the files symlinks point to (not the directories)
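
    A small FTP globbing sketch (placeholder URL); the quotes keep the shell from expanding the * itself, so wget matches it against the server's listing:

    $ wget --passive-ftp --glob=on 'ftp://example.com/pub/*.tar.gz'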

    Recursive download

    -r, --recursive          recursive download; use with care!
    -l, --level=NUMBER       maximum recursion depth (inf or 0 means unlimited)
    --delete-after           delete the downloaded files locally after completion
    -k, --convert-links      convert non-relative links to relative ones
    -K, --backup-converted   before converting file X, back it up as X.orig
    -m, --mirror             equivalent to -r -N -l inf -nr
    -p, --page-requisites    download all files needed to display the HTML pages (images, etc.)
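
    Putting these together, a hedged mirroring sketch (illustrative URL): -K keeps a .orig copy of each page before -k rewrites its links, and -p pulls in the images the pages need.

    $ wget -m -k -K -p http://www.example.com/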

    Include and exclude in recursive download (accept/reject)

    -A, --accept=LIST                comma-separated list of accepted extensions
    -R, --reject=LIST                comma-separated list of rejected extensions
    -D, --domains=LIST               comma-separated list of accepted domains
    --exclude-domains=LIST           comma-separated list of rejected domains
    --follow-ftp                     follow FTP links in HTML documents
    --follow-tags=LIST               comma-separated list of HTML tags to follow
    -G, --ignore-tags=LIST           comma-separated list of HTML tags to ignore
    -H, --span-hosts                 recurse into foreign hosts
    -L, --relative                   follow relative links only
    -I, --include-directories=LIST   list of allowed directories
    -X, --exclude-directories=LIST   list of excluded directories
    -np, --no-parent                 do not ascend to the parent directory
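
    As a closing sketch (URL and paths are illustrative): recurse without a depth limit, accept only .pdf and .ps files, skip one subdirectory, and never climb above the starting point.

    $ wget -r -l inf -np -A pdf,ps -X /papers/private http://example.com/papers/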


    Note: to stop a download, press Ctrl+C.


    If you reprint this article, please credit: http://hi.baidu.com/wowcser/blog/item/8008732168e91ba84623e8b5.html

