1. curl (file transfer tool)
Common parameters are as follows:
-c, --cookie-jar: write cookies to a file
-b, --cookie: read cookies from a file
-C, --continue-at: resume a transfer from a given offset (resume broken downloads)
-d, --data: send data with an HTTP POST
-D, --dump-header: write the received headers to a file
-F, --form: emulate an HTTP form submission
-s, --silent: reduce output (silent mode)
-o, --output: write output to a file
-O, --remote-name: save locally using the file name from the server
-I, --head: fetch the headers only
-u, --user <user:pass>: set the HTTP authentication user and password
-T, --upload-file: upload a file
-e, --referer: specify a referer address
-x, --proxy: specify the proxy server address and port
-w, --write-out: output content in the specified format (see the sketch after Example 7)
--retry: number of retries
--connect-timeout: maximum time allowed for the connection, in seconds
Examples of Use:
Example 1: Fetch the page into the specified file; if the output is garbled, it can be transcoded with iconv
# curl -o baidu.html www.baidu.com
# curl -s www.baidu.com | iconv -f utf-8 -o baidu.html    # -s reduces output; iconv transcodes from UTF-8 to the local encoding
Example 2: Simulate a browser header (User-Agent)
# curl -A "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0)" www.baidu.com
Example 3: Handle redirected pages
# curl -L http://192.168.1.100/301.php    # by default curl does not follow redirects
Example 4: Simulate a user login, save the cookie information to cookies.txt, then log in again using the cookie
# curl -c ./cookies.txt -F name=user -F pwd=*** URL    # name and pwd are form field names; they differ from site to site
# curl -b ./cookies.txt URL
Example 5: Get the HTTP response headers
# curl -I http://www.baidu.com
# curl -D ./header.txt http://www.baidu.com    # save the headers to a file
Example 6: Access a page protected by HTTP authentication
# curl -u user:pass URL
Example 7: Upload and download files via FTP
# curl -T filename ftp://user:pass@ip/docs/    # upload
# curl -O ftp://user:pass@ip/filename    # download
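The -w, --connect-timeout, --retry, -x, and -e options from the parameter list are not covered by the examples above; here is a minimal sketch of how they can be combined (the proxy address 127.0.0.1:8080 is a placeholder, and the URLs are reused from the examples):
# curl -s -o /dev/null -w "%{http_code}\n" --connect-timeout 5 --retry 3 http://www.baidu.com    # print only the HTTP status code; give up connecting after 5s, retry up to 3 times
# curl -x http://127.0.0.1:8080 -e "http://www.baidu.com" http://192.168.1.100/301.php    # go through a proxy and send a Referer header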
2. wget (file download tool)
Common parameters are as follows:
2.1 Startup Parameters
-V, --version: show the version number
-h, --help: show help
-b, --background: go to the background after startup
2.2 Logging and input file parameters
-o, --output-file=FILE: write log messages to FILE (a minimal example follows this group)
-a, --append-output=FILE: append log messages to FILE
-i, --input-file=FILE: read the URLs to download from FILE
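A minimal sketch of the logging options above, assuming a log file named download.log:
# wget -o download.log http://nginx.org/download/nginx-1.8.0.tar.gz    # write the run log to download.log instead of the screen
# wget -a download.log http://nginx.org/download/nginx-1.6.3.tar.gz    # append a second run to the same log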
2.3 Download Parameters
--bind-address=ADDRESS: bind to ADDRESS on the local machine
-t, --tries=NUMBER: set the maximum number of retries
-c, --continue: resume getting a partially downloaded file
-O, --output-document=FILE: write the downloaded document to FILE
--spider: do not download, only check that the page exists
-T, --timeout=SECONDS: set the response timeout
-w, --wait=SECONDS: wait the given number of seconds between retrievals
--limit-rate=RATE: limit the download rate
--progress=TYPE: select the progress indicator type
-q, --quiet: quiet mode (no output); a combined sketch follows this group
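A minimal sketch combining several of the download options above (the timeout, retry count, wait interval, and output name are example values):
# wget -O nginx.tar.gz -T 30 -t 5 -w 2 -q http://nginx.org/download/nginx-1.8.0.tar.gz    # save as nginx.tar.gz, 30s timeout, 5 tries, 2s between tries, no output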
2.4 Directory Parameters
-P, --directory-prefix=PREFIX: save files to the specified directory
2.5 HTTP Parameters
--http-user=USER: set the HTTP user name (see the sketch after this group)
--http-passwd=PASS: set the HTTP password
-U, --user-agent=AGENT: identify as AGENT instead of Wget (disguise the user agent)
--no-http-keep-alive: disable HTTP keep-alive (persistent connections)
--cookies=off: do not use cookies
--load-cookies=FILE: load cookies from FILE before the session starts
--save-cookies=FILE: save cookies to FILE at the end of the session
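A minimal sketch of the HTTP options above; the user, password, URL, and cookie file are placeholders, and newer wget versions spell the password option --http-password:
# wget --http-user=user --http-passwd=pass http://192.168.1.100/protected.tar.gz    # HTTP basic authentication
# wget -U "Mozilla/5.0" --load-cookies=cookies.txt --save-cookies=cookies.txt http://192.168.1.100/index.html    # custom User-Agent plus cookie reuse across runs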
2.6 FTP Parameters
--passive-ftp: use passive transfer mode (the default)
--active-ftp: use active transfer mode
2.7 Recursive Download Filter Parameters
-A, --accept=LIST: comma-separated list of file extensions to download (a sketch follows this list)
-R, --reject=LIST: comma-separated list of file extensions not to download
-D, --domains=LIST: comma-separated list of domains to follow
--exclude-domains=LIST: comma-separated list of domains not to follow
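These filter options only take effect together with recursive download (-r); a minimal sketch, where the extension list and the example.com domain are placeholders:
# wget -r -A "tar.gz" http://nginx.org/download/    # recursively download only .tar.gz files
# wget -r -H -D nginx.org --exclude-domains example.com http://nginx.org/    # when spanning hosts (-H), stay inside nginx.org and skip example.com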
Examples of Use:
Example 1: Download a single file to the current directory; the download directory can also be specified with -P
# wget http://nginx.org/download/nginx-1.8.0.tar.gz
Example 2: On an unstable network, use the -c and --tries parameters to make sure the download completes
# wget --tries=20 -c http://nginx.org/download/nginx-1.8.0.tar.gz
Example 3: When downloading a large file, we can run it in the background; a wget-log file is then generated to record the download progress
# wget -b http://nginx.org/download/nginx-1.8.0.tar.gz
Example 4: Use the --spider parameter to check whether a URL is valid
# wget --spider http://nginx.org/download/nginx-1.8.0.tar.gz
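Whether the URL is valid can be read from wget's exit status; a minimal sketch:
# wget --spider -q http://nginx.org/download/nginx-1.8.0.tar.gz && echo "URL is valid" || echo "URL is invalid"    # exit status 0 means the resource exists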
Example 5: Automatically download files from multiple links
# cat url_list.txt    # first create a file containing the URLs
http://nginx.org/download/nginx-1.8.0.tar.gz
http://nginx.org/download/nginx-1.6.3.tar.gz
# wget -i url_list.txt
Example 6: Limit download speed
# wget --limit-rate=1m http://nginx.org/download/nginx-1.8.0.tar.gz
Example 7: Log in to an FTP server and download a file
# wget --ftp-user=user --ftp-password=pass ftp://ip/filename
This article is from the "Baby God" blog; please keep this source when reprinting: http://babyshen.blog.51cto.com/8405584/1884852