Downloading files with wget in the Linux terminal

Source: Internet
Author: User
Tags: ftp, file, http, authentication, save file

Wget is the most common download tool on Linux.

There are two common ways to use it:

Download directly to the current directory:

wget URL

Download and save under a specified file name:

wget -O fileName URL
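
For example, to fetch a file into the current directory, or to save it under a different name (the URL and file names below are placeholders, not from the original article):

wget http://example.com/pkg.tar.gz
wget -O mypkg.tar.gz http://example.com/pkg.tar.gz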


The relevant parameters are as follows:

# wget --help

GNU Wget 1.12, a non-interactive network file download tool.
Usage: wget [options]... [URL]...

Arguments that are mandatory for long options are mandatory for short options too.

Startup:
-V, --version display the version of Wget and exit.
-h, --help print this help.
-b, --background go to background after startup.
-e, --execute=COMMAND run a '.wgetrc'-style command.
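
For example, a long download can be pushed to the background with -b; output then goes to wget-log in the current directory (the URL is a placeholder):

wget -b http://example.com/large-file.iso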

Logging and input file:
-o, --output-file=FILE write log messages to FILE.
-a, --append-output=FILE append log messages to FILE.
-d, --debug print lots of debugging information.
-q, --quiet quiet mode (no output).
-v, --verbose verbose output (this is the default).
-nv, --no-verbose turn off verbose output, without entering quiet mode.
-i, --input-file=FILE download URLs found in a local or external FILE.
-F, --force-html treat the input file as HTML.
-B, --base=URL resolve links in the HTML input file (given with -i -F) relative to URL.
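
For instance, to read a list of URLs from a file and keep a log of the run (urls.txt and wget.log are placeholder names):

wget -i urls.txt -o wget.log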

Download:
-t, --tries=NUMBER set the number of retries (0 means unlimited).
--retry-connrefused retry even if the connection is refused.
-O, --output-document=FILE write the document to FILE.
-nc, --no-clobber do not re-download files that already exist.
-c, --continue resume a partially downloaded file.
--progress=TYPE select the progress bar type.
-N, --timestamping only retrieve files newer than the local copies.
-S, --server-response print the server response.
--spider do not download anything.
-T, --timeout=SECONDS set all timeout values to SECONDS.
--dns-timeout=SECS set the DNS lookup timeout to SECS seconds.
--connect-timeout=SECS set the connection timeout to SECS seconds.
--read-timeout=SECS set the read timeout to SECS seconds.
-w, --wait=SECONDS wait SECONDS between retrievals.
--waitretry=SECONDS wait 1..SECONDS between retries of a retrieval.
--random-wait wait 0...2*WAIT seconds between retrievals.
--no-proxy explicitly turn off the proxy.
-Q, --quota=NUMBER set the retrieval quota to NUMBER bytes.
--bind-address=ADDRESS bind to ADDRESS (hostname or IP) on the local host.
--limit-rate=RATE limit the download rate to RATE.
--no-dns-cache disable caching of DNS lookups.
--restrict-file-names=OS restrict characters in file names to those the given OS allows.
--ignore-case ignore case when matching files/directories.
-4, --inet4-only connect only to IPv4 addresses.
-6, --inet6-only connect only to IPv6 addresses.
--prefer-family=FAMILY connect first to addresses of the specified family; FAMILY is IPv6, IPv4, or none.
--user=USER set both the FTP and HTTP user name to USER.
--password=PASS set both the FTP and HTTP password to PASS.
--ask-password prompt for a password.
--no-iri turn off IRI support.
--local-encoding=ENC use ENC as the local encoding for IRIs.
--remote-encoding=ENC use ENC as the default remote encoding.
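
As a sketch of how these fit together, the following resumes an interrupted download, retries up to 5 times, and caps the speed at 200 KB/s (the URL is a placeholder):

wget -c -t 5 --limit-rate=200k http://example.com/big.iso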

Directory:
-nd, --no-directories do not create directories.
-x, --force-directories force creation of directories.
-nH, --no-host-directories do not create host directories.
--protocol-directories use the protocol name in directories.
-P, --directory-prefix=PREFIX save files to PREFIX/...
--cut-dirs=NUMBER ignore NUMBER remote directory components.
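
For example, to save a download under a specific directory instead of the current one (the path and URL are placeholders):

wget -P /tmp/downloads http://example.com/file.zip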

HTTP options:
--http-user=USER set the HTTP user name to USER.
--http-password=PASS set the HTTP password to PASS.
--no-cache do not use data cached by the server.
--default-page=NAME change the default page name (normally "index.html").
-E, --adjust-extension save HTML/CSS documents with the proper extension.
--ignore-length ignore the 'Content-Length' header field.
--header=STRING insert STRING among the request headers.
--max-redirect maximum number of redirections allowed per page.
--proxy-user=USER use USER as the proxy user name.
--proxy-password=PASS use PASS as the proxy password.
--referer=URL include 'Referer: URL' in the HTTP request header.
--save-headers save the HTTP headers to the file.
-U, --user-agent=AGENT identify as AGENT instead of Wget/VERSION.
--no-http-keep-alive disable HTTP keep-alive (persistent connections).
--no-cookies do not use cookies.
--load-cookies=FILE load cookies from FILE before the session starts.
--save-cookies=FILE save cookies to FILE after the session ends.
--keep-session-cookies load and save session (non-permanent) cookies.
--post-data=STRING use POST and send STRING as the data.
--post-file=FILE use POST and send the contents of FILE.
--content-disposition honor the Content-Disposition header when choosing local file names (experimental).
--auth-no-challenge send Basic HTTP authentication information without first waiting for the server's challenge.
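
For instance, a password-protected page can be fetched with HTTP authentication, or form data can be sent with POST (the credentials and URLs below are placeholders):

wget --http-user=alice --http-password=secret http://example.com/private/report.pdf
wget --post-data="name=alice&lang=en" http://example.com/form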

HTTPS (SSL/TLS) options:
--secure-protocol=PR choose the secure protocol; one of auto, SSLv2, SSLv3, or TLSv1.
--no-check-certificate do not validate the server's certificate.
--certificate=FILE client certificate file.
--certificate-type=TYPE client certificate type, PEM or DER.
--private-key=FILE private key file.
--private-key-type=TYPE private key file type, PEM or DER.
--ca-certificate=FILE file with a bundle of CA certificates.
--ca-directory=DIR directory where the hash list of CA certificates is stored.
--random-file=FILE file with random data for seeding the SSL PRNG.
--egd-file=FILE file naming the EGD socket with random data.
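
As an example, a file on a server with a self-signed certificate can be fetched by skipping certificate validation (the URL is a placeholder; only do this when you trust the server):

wget --no-check-certificate https://example.com/internal/file.tar.gz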

FTP options:
--ftp-user=USER set the FTP user name to USER.
--ftp-password=PASS set the FTP password to PASS.
--no-remove-listing do not delete the '.listing' files.
--no-glob do not expand wildcards in FTP file names.
--no-passive-ftp disable the "passive" transfer mode.
--retr-symlinks when recursing, retrieve linked-to files (not directories).
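
For example, to download from an FTP server that requires a login (the server, credentials and path are placeholders):

wget --ftp-user=alice --ftp-password=secret ftp://ftp.example.com/pub/file.tar.gz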

Recursive download:
-r, --recursive specify recursive download.
-l, --level=NUMBER maximum recursion depth (inf or 0 means unlimited).
--delete-after delete the local files after downloading them.
-k, --convert-links make links in downloaded HTML or CSS point to local files.
-K, --backup-converted back up file X as X.orig before converting it.
-m, --mirror shortcut for -N -r -l inf --no-remove-listing.
-p, --page-requisites download all elements (such as images) needed to display the HTML page.
--strict-comments turn on strict (SGML) handling of HTML comments.
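
For example, a small site can be mirrored for offline reading, rewriting links and fetching page requisites (the URL is a placeholder):

wget -m -k -p http://example.com/docs/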

Recursive accept/reject:
-A, --accept=LIST comma-separated list of accepted extensions.
-R, --reject=LIST comma-separated list of rejected extensions.
-D, --domains=LIST comma-separated list of accepted domains.
--exclude-domains=LIST comma-separated list of rejected domains.
--follow-ftp follow FTP links in HTML documents.
--follow-tags=LIST comma-separated list of HTML tags to follow.
--ignore-tags=LIST comma-separated list of HTML tags to ignore.
-H, --span-hosts go to foreign hosts when recursing.
-L, --relative follow relative links only.
-I, --include-directories=LIST list of allowed directories.
-X, --exclude-directories=LIST list of excluded directories.
-np, --no-parent do not ascend to the parent directory.
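
For instance, to grab only images from a directory tree without climbing back up to the parent directory (the URL and extensions are placeholders):

wget -r -np -A jpg,png http://example.com/gallery/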

This article is from the "gangbusters" blog; please keep this source: http://php2012web.blog.51cto.com/5585213/1614268
