1. curl (file transfer tool)
Common parameters are as follows:
-c,--cookie-jar: write cookies to a file after the transfer
-b,--cookie: read cookies from a file
-C,--continue-at: resume a transfer at the given offset
-d,--data: send data with an HTTP POST
-D,--dump-header: write the header information to a file
-F,--form: submit data as an HTTP multipart form
-s,--silent: silent mode, reduce output information
-o,--output: write output to a file
-O,--remote-name: save locally using the file name from the server
-I,--head: fetch the header information only
-u,--user <user:pass>: set the HTTP authentication user and password
-T,--upload-file: upload a file
-e,--referer: specify a referer address
-x,--proxy: specify the proxy server address and port
-w,--write-out: output content in the specified format
--retry: number of retries
--connect-timeout: maximum time in seconds allowed for the connection attempt
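Several of the parameters above are commonly combined. The following is a minimal sketch combining -s (silent), -o (output file) and -w (write-out); the file:// URL to a temporary file is only there so the sketch runs without network access — substitute any http:// URL in practice, and the /tmp file names are arbitrary:

```shell
# Create a small source file to stand in for a remote resource
# (assumption: /tmp is writable; use any http:// URL in real use).
printf 'hello curl\n' > /tmp/curl_demo_src.txt

# -s suppresses the progress meter, -o saves the body to a file,
# -w prints transfer statistics in a user-specified format.
curl -s -o /tmp/curl_demo_out.txt \
     -w 'downloaded %{size_download} bytes in %{time_total}s\n' \
     file:///tmp/curl_demo_src.txt

cat /tmp/curl_demo_out.txt
```

The %{size_download} and %{time_total} variables are two of many that -w understands; see `man curl` for the full list.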
Examples of Use:
Example 1: Fetch a page into the specified file; if the content is garbled, use iconv to convert the encoding
# curl -o baidu.html www.baidu.com
# curl -s www.baidu.com | iconv -f utf-8 > baidu.html  # -s reduces output information
Example 2: Simulate a browser header (User-Agent)
# curl -A "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0)" www.baidu.com
Example 3: Handle redirected pages
# curl -L http://192.168.1.100/301.php  # by default curl does not follow redirects
Example 4: Simulate a user login, save the cookie information to a cookies.txt file, then log in again using the cookie
# curl -c ./cookies.txt -F name=user -F pwd=*** URL  # the name and pwd form field names differ from site to site
# curl -b ./cookies.txt URL
Example 5: Get the HTTP response headers
# curl -I http://www.baidu.com
# curl -D ./header.txt http://www.baidu.com  # save the headers to a file
Example 6: Access an HTTP-authenticated page
# curl -u user:pass URL
Example 7: Upload and download files via FTP
# curl -T filename ftp://user:[email protected]/docs/  # upload
# curl -O ftp://user:[email protected]/filename  # download
2. wget (file download tool)
Common parameters are as follows:
2.1 Startup parameters
-V,--version: display the version number
-h,--help: view help
-b,--background: go to the background after starting
2.2 Logging and input file parameters
-o,--output-file=FILE: write log messages to FILE
-a,--append-output=FILE: append log messages to FILE
-i,--input-file=FILE: read URLs to download from FILE
2.3 Download parameters
--bind-address=ADDRESS: bind to the given local address
-t,--tries=NUMBER: set the maximum number of connection attempts
-c,--continue: resume a partially downloaded file
-O,--output-document=FILE: write the download to FILE
--spider: do not download anything (just check)
-T,--timeout=SECONDS: set the response timeout
-w,--wait=SECONDS: wait the given number of seconds between retrievals
--limit-rate=RATE: limit the download rate
--progress=TYPE: set the progress indicator type
2.4 Directory parameters
-P,--directory-prefix=PREFIX: save files to the specified directory
2.5 HTTP parameters
--http-user=USER: set the HTTP user name
--http-password=PASS: set the HTTP password
-U,--user-agent=AGENT: identify as AGENT (disguise the client)
--no-http-keep-alive: disable HTTP keep-alive (persistent connections)
--no-cookies: do not use cookies
--load-cookies=FILE: load cookies from FILE before the session
--save-cookies=FILE: save cookies to FILE after the session
2.6 FTP Parameters
-PASSIVE-FTP: Default value, using passive mode
-ACTIVE-FTP: Using active mode
2.7 Recursive download accept/reject parameters
-A,--accept=LIST: comma-separated list of file extensions to accept
-R,--reject=LIST: comma-separated list of file extensions to reject
-D,--domains=LIST: comma-separated list of domains to follow
--exclude-domains=LIST: comma-separated list of domains not to follow
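As a sketch of how several of the download parameters above (-t, -T, -q, -O) combine, the following serves a file from a throwaway local HTTP server so it runs without external network access; python3 on port 8018 bound to 127.0.0.1 is purely an assumption for the demo — in practice the URL would be a remote host:

```shell
# Serve a small file locally (assumptions: python3 available, port 8018 free).
mkdir -p /tmp/wget_demo && cd /tmp/wget_demo
printf 'hello wget\n' > served.txt
python3 -m http.server 8018 --bind 127.0.0.1 >/dev/null 2>&1 &
srv=$!
sleep 1

# -q quiet log, -t 3 retry up to 3 times, -T 5 five-second timeout,
# -O save under a chosen name.
wget -q -t 3 -T 5 -O fetched.txt http://127.0.0.1:8018/served.txt

kill $srv
cat fetched.txt
```

The same flags apply unchanged to any http://, https://, or ftp:// URL.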
Examples of Use:
Example 1: Download a single file to the current directory; a download directory can also be specified with -P
# wget http://nginx.org/download/nginx-1.8.0.tar.gz
Example 2: On an unstable network, use the -c and --tries parameters to make sure the download completes
# wget --tries=20 -c http://nginx.org/download/nginx-1.8.0.tar.gz
Example 3: When downloading a large file, run wget in the background; the download progress is written to a wget-log file
# wget -b http://nginx.org/download/nginx-1.8.0.tar.gz
Example 4: Use the --spider parameter to check whether a URL is valid
# wget --spider http://nginx.org/download/nginx-1.8.0.tar.gz
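Because --spider makes wget exit with a nonzero status when the URL cannot be reached, the check can drive a script. A sketch, reusing the nginx URL from the example (the actual check needs network access, so either branch may print):

```shell
# --spider checks the URL without downloading; wget's exit status
# reports the result. -T and -t bound how long the check can take.
url=http://nginx.org/download/nginx-1.8.0.tar.gz
if wget -q --spider -T 10 -t 2 "$url"; then
    echo "URL is valid: $url"
else
    echo "URL is unreachable: $url"
fi
```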
Example 5: Download files from multiple links automatically
# cat url_list.txt  # create a URL list file first
http://nginx.org/download/nginx-1.8.0.tar.gz
http://nginx.org/download/nginx-1.6.3.tar.gz
# wget -i url_list.txt
Example 6: Limit download speed
# wget --limit-rate=1m http://nginx.org/download/nginx-1.8.0.tar.gz
Example 7: Log in to an FTP server and download a file
# wget--ftp-user=user--ftp-password=pass ftp://ip/filename
This article is from the "Penguin" blog; please keep this source: http://lizhenliang.blog.51cto.com/7876557/1650663