Advanced use of the Linux tools curl and wget
1. curl (file transfer tool)
Common parameters are as follows:
-c, --cookie-jar <file>: write cookies to a file after the operation
-b, --cookie <file>: read cookies from a file
-C, --continue-at <offset>: resume a previous transfer
-d, --data <data>: send data with an HTTP POST
-D, --dump-header <file>: write the received headers to a file
-F, --form <name=content>: submit data as an HTTP form
-s, --silent: silent mode, reduce output
-o, --output <file>: write output to the given file
-O, --remote-name: save locally using the file name from the server
-I, --head: fetch the headers only
-u, --user <user:pass>: set the HTTP authentication user and password
-T, --upload-file <file>: upload a file
-e, --referer <URL>: specify the referer address
-x, --proxy <host:port>: specify the proxy server address and port
-w, --write-out <format>: output information in the specified format
--retry <num>: number of retries
--connect-timeout <seconds>: maximum time allowed for the connection, in seconds
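The --retry, --connect-timeout, and -w parameters do not appear in the examples below; a minimal sketch combining them (the URL is a placeholder):
# curl --retry 3 --connect-timeout 10 -s -o /dev/null -w "status: %{http_code}, time: %{time_total}s\n" http://example.com/ # retry up to 3 times, allow 10 seconds to connect, then print the status code and total transfer time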
Examples:
Example 1: save a page to the specified file. If the content comes out garbled, transcode it with iconv.
# curl -o baidu.html www.baidu.com
# curl -s www.baidu.com | iconv -f UTF-8 > baidu.html # -s reduces the output noise; iconv transcodes the page
Example 2: simulate a browser by setting the User-Agent header
# curl -A "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0)" www.baidu.com
Example 3: follow a redirecting page
# curl -L http://192.168.1.100/301.php # by default curl does not follow redirects
Example 4: simulate a user login, save the cookie information to cookies.txt, and then use the cookie to log on
# curl -c ./cookies.txt -F name=user -F pwd=*** URL # the form field names (name, pwd) differ from site to site
# curl -b ./cookies.txt URL
Example 5: get the HTTP response headers
# curl -I http://www.baidu.com
# curl -D ./header.txt http://www.baidu.com # save the headers to a file
Example 6: access a page protected by HTTP authentication
# curl -u user:pass URL
Example 7: upload and download files over FTP
# curl -T filename ftp://user:pass@ip/docs # upload
# curl -O ftp://user:pass@ip/filename # download
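The -x/--proxy and -e/--referer parameters are listed above but not exercised in the examples; a minimal sketch (the proxy address and URLs are placeholders):
# curl -x 192.168.1.100:8080 -e "http://example.com/" http://example.com/page.html # fetch a page through a proxy, sending a Referer header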
2. wget (file download tool)
Common parameters are as follows:
2.1 Startup parameters
-V, --version: display the version number
-h, --help: display help
-b, --background: go to the background after startup
2.2 Logging and input file parameters
-o, --output-file=FILE: write log messages to FILE
-a, --append-output=FILE: append log messages to FILE
-i, --input-file=FILE: read the URLs to download from FILE
2.3 Download parameters
--bind-address=ADDRESS: bind to the given local address
-t, --tries=NUMBER: set the maximum number of connection attempts
-c, --continue: resume a partially downloaded file
-O, --output-document=FILE: write the downloaded content to FILE
--spider: do not download anything (just check that the files exist)
-T, --timeout=SECONDS: set the response timeout
-w, --wait=SECONDS: wait the given number of seconds between retrievals
--limit-rate=RATE: limit the download speed
--progress=TYPE: select the progress indicator type
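A minimal sketch combining the timeout, wait, and progress options above (the values are chosen for illustration only):
# wget -T 30 -w 2 --progress=dot http://nginx.org/download/nginx-1.8.0.tar.gz # 30-second response timeout, 2-second wait between retrievals, dot-style progress indicator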
2.4 Directory parameters
-P, --directory-prefix=PREFIX: save files to the specified directory
2.5 HTTP parameters
--http-user=USER: set the HTTP user name
--http-passwd=PASS: set the HTTP password
-U, --user-agent=AGENT: set the User-Agent string sent to the server
--no-http-keep-alive: disable HTTP keep-alive (persistent connections)
--no-cookies: do not use cookies
--load-cookies=FILE: load cookies from FILE before the session starts
--save-cookies=FILE: save cookies to FILE after the session ends
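The cookie options pair naturally with a form login; a minimal sketch, assuming a hypothetical login URL and form field names (user, pwd):
# wget --save-cookies cookies.txt --post-data "user=me&pwd=***" http://example.com/login # log in and save the session cookie
# wget --load-cookies cookies.txt http://example.com/member/page.html # reuse the saved cookie on a protected page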
2.6 FTP parameters
--passive-ftp: use passive transfer mode (the default)
--no-passive-ftp: use active transfer mode
2.7 Recursive download and exclusion parameters
-A, --accept=LIST: comma-separated list of file name extensions to download
-R, --reject=LIST: comma-separated list of file name extensions to skip
-D, --domains=LIST: comma-separated list of domains to follow
--exclude-domains=LIST: comma-separated list of domains to exclude
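These accept/reject lists only take effect with recursive retrieval (-r); a minimal sketch (the URL is a placeholder):
# wget -r -l 1 -A gz,zip http://example.com/downloads/ # recurse one level deep, keeping only .gz and .zip files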
Examples:
Example 1: download a single file to the current directory; -P can be used to choose a different download directory.
# wget http://nginx.org/download/nginx-1.8.0.tar.gz
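As mentioned, -P changes where the file is saved; a minimal sketch (the target directory is arbitrary):
# wget -P /usr/local/src http://nginx.org/download/nginx-1.8.0.tar.gz # save into /usr/local/src instead of the current directory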
Example 2: on an unstable network, use the -c and --tries parameters to make sure the download completes.
# wget --tries=20 -c http://nginx.org/download/nginx-1.8.0.tar.gz
Example 3: large files can be downloaded in the background; a wget-log file is generated to record the download progress.
# wget -b http://nginx.org/download/nginx-1.8.0.tar.gz
Example 4: use the --spider parameter to check whether a URL is valid.
# wget --spider http://nginx.org/download/nginx-1.8.0.tar.gz
Example 5: download files from multiple links automatically
# cat url_list.txt # first create a file containing the URLs
http://nginx.org/download/nginx-1.8.0.tar.gz
http://nginx.org/download/nginx-1.6.3.tar.gz
# wget -i url_list.txt
Example 6: limit the download speed
# wget --limit-rate=1m http://nginx.org/download/nginx-1.8.0.tar.gz
Example 7: log on to an FTP server and download a file
# wget --ftp-user=user --ftp-password=pass ftp://ip/filename