Linux Commands: The curl Command in Detail


Command: curl
In Linux, curl is a command-line file transfer tool that works with URL syntax; it can fairly be called a powerful HTTP command-line tool. It supports both uploading and downloading files, making it a comprehensive transfer tool, although by convention it is usually thought of as a download tool.


Syntax: # curl [option] [url]


Common parameters:
-A/--user-agent <string> set the User-Agent string to send to the server
-b/--cookie <name=string/file> cookie string, or file to read cookies from
-c/--cookie-jar <file> write cookies to this file after the operation completes
-C/--continue-at <offset> resume a transfer at the given offset (breakpoint continuation)
-D/--dump-header <file> write the header information to this file
-e/--referer <URL> source (referer) URL
-f/--fail fail silently: no HTTP error page is displayed when the request fails
-o/--output <file> write output to the given file
-O/--remote-name write output to a local file named like the remote file
-r/--range <range> retrieve only a byte range from an HTTP/1.1 or FTP server
-s/--silent silent mode; don't output anything
-T/--upload-file <file> upload a file
-u/--user <user[:password]> set the user and password for the server
-w/--write-out <format> what to output after completion
-x/--proxy <host[:port]> use a proxy on the given port
-#/--progress-bar display transfer progress as a progress bar
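For instance, several of these options are often combined in a single call; a minimal sketch against the placeholder host used in the examples below (page.html and my-agent/1.0 are arbitrary illustrative names):
# curl -s -o page.html -w "%{http_code}\n" -A "my-agent/1.0" http://www.doiido.com
Here -s suppresses the progress meter, -o saves the body to page.html, -w prints the response code after the transfer, and -A sets a custom User-Agent.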


Example:
1. Basic usage
# curl http://www.doiido.com
Once executed, the HTML of www.doiido.com is displayed on the screen.
Ps: Since Linux servers are often installed without a desktop, and therefore without a browser, this method is frequently used to test from a server whether it can reach a given website.
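A lighter-weight reachability check is to request only the response headers with -I/--head (listed among the other parameters below), so no page body is transferred:
# curl -I http://www.doiido.com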


2. Save the visited Web page
2.1: Save using the shell's output redirection
# curl http://www.doiido.com >> doiido.html



2.2: You can use curl's built-in option -o (lowercase) to save the Web page
$ curl -o doiido.html http://www.doiido.com
After execution, output like the following appears; 100% indicates a successful save:
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 79684    0 79684    0     0  3437k      0 --:--:-- --:--:-- --:--:-- 7781k



2.3: You can use curl's built-in option -O (uppercase) to save a file from a Web page
Note that the URL here must point to a specific file; otherwise nothing will be fetched.
# curl -O http://www.doiido.com/hello.sh


3. Test a page's return code
# curl -o /dev/null -s -w %{http_code} www.doiido.com
Ps: In scripts, this is a very common way to test whether a website is up.
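As a minimal sketch of such a scripted check (testing for 200 and the echoed messages are illustrative choices, not part of the original example):
#!/bin/sh
# -o /dev/null discards the body, -s hides the progress meter,
# and -w prints the variable we ask for, here the HTTP status code.
code=$(curl -o /dev/null -s -w "%{http_code}" http://www.doiido.com)
if [ "$code" = "200" ]; then
    echo "site is up"
else
    echo "site returned $code"
fi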


4. Specify a proxy server and its port
Much of the time, Internet access requires going through a proxy server (for example, because you browse via a proxy, or because the target site has blocked your IP for crawling it with curl). Fortunately, curl supports setting a proxy with the built-in option -x.
# curl -x 192.168.100.100:1080 http://www.doiido.com
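If the proxy itself requires authentication, the -U/--proxy-user option from the parameter list below can be added (user:password is a placeholder):
# curl -x 192.168.100.100:1080 -U user:password http://www.doiido.com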


5. Cookies
Some websites use cookies to record session information. Browsers such as Chrome handle cookies easily, and curl can process them just as easily by adding the relevant options.
5.1: Save the cookie information from the HTTP response. Built-in option: -c (lowercase)
# curl -c cookiec.txt http://www.doiido.com
After execution, the cookie information is stored in cookiec.txt.


5.2: Save the header information from the HTTP response. Built-in option: -D
# curl -D cookied.txt http://www.doiido.com
After execution, the header (including cookie) information is stored in cookied.txt.

Note: the cookies produced by -c (lowercase) and by -D are not in the same format.


5.3: Using cookies
Many websites monitor your cookie information to decide whether you are visiting according to the rules, so we need to use the stored cookie information. Built-in option: -b
# curl -b cookiec.txt http://www.doiido.com
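Putting -c and -b together gives a simple session flow. A sketch assuming a hypothetical /login endpoint and placeholder credentials:
# curl -c cookiec.txt -d "user=foo&pass=bar" http://www.doiido.com/login
# curl -b cookiec.txt http://www.doiido.com/member
The first request stores whatever session cookie the (assumed) login page sets; the second request replays it.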


6. Imitate a browser
Some websites require a specific browser, or even a specific version of one, before they allow access. curl's built-in option -A lets us specify the browser identity used for the visit.
# curl -A "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.0)" http://www.doiido.com
The server will then believe it is being accessed by IE 8.0.


7. Forge the Referer (hotlinking protection)
Many servers check the HTTP Referer header to control access. For example: you first visit the home page and then visit its mailbox page; the Referer on the mailbox request is the address of the home page you just visited. If the server finds that the Referer on a mailbox-page request is not the home page's address, it concludes the request is hotlinked.
curl's built-in option -e lets us set the Referer:
# curl -e "www.doiido.com" http://mail.doiido.com
This makes the server think you clicked a link on www.doiido.com.


8. Download files
8.1: Download a file using curl.
# Use the built-in option -o (lowercase):
# curl -o dodo1.jpg http://www.doiido.com/dodo1.jpg

# Use the built-in option -O (uppercase):
# curl -O http://www.doiido.com/dodo1.jpg
This saves the file locally under the name it has on the server.



8.2: Loop download
Sometimes the images to download share the same leading part of the name and differ only in the trailing part.
# curl -O http://www.doiido.com/dodo[1-5].jpg
This saves dodo1.jpg through dodo5.jpg in one go.



8.3: Download and rename
# curl -O http://www.doiido.com/{hello,bb}/dodo[1-5].jpg
Because the file names under both hello and bb are dodo1 through dodo5, the second set downloaded would overwrite the first, so the files need to be renamed.
# curl -o #1_#2.jpg http://www.doiido.com/{hello,bb}/dodo[1-5].jpg
With this, hello/dodo1.jpg becomes hello_dodo1.jpg when downloaded, and similarly for the other files, which effectively avoids overwriting.


8.4: Download in chunks
Sometimes the file to download is relatively large; in that case we can download it in sections, using the built-in option -r.
# curl -r 0-100 -o dodo1_part1.jpg http://www.doiido.com/dodo1.jpg
# curl -r 101-200 -o dodo1_part2.jpg http://www.doiido.com/dodo1.jpg
# curl -r 201- -o dodo1_part3.jpg http://www.doiido.com/dodo1.jpg
# cat dodo1_part* > dodo1.jpg
After concatenating the parts, you can view the complete contents of dodo1.jpg.
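HTTP byte ranges are zero-based and inclusive at both ends, which is why the parts above split at 100/101 and 200/201; the part sizes can be checked before concatenating:
# ls -l dodo1_part*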


8.5: Download files via FTP
curl can download files via FTP, and it offers two syntaxes for doing so:
# curl -O -u username:password ftp://www.doiido.com/dodo1.jpg
# curl -O ftp://username:password@www.doiido.com/dodo1.jpg


8.6: Show a download progress bar
# curl -# -O http://www.doiido.com/dodo1.jpg


8.7: Do not display download progress information
# curl -s -O http://www.doiido.com/dodo1.jpg


9. Resume broken downloads (breakpoint continuation)
In Windows we can use download managers such as Xunlei (Thunder) to resume broken downloads. curl achieves the same effect with the built-in option -C.
If the connection suddenly drops while downloading dodo1.jpg, you can resume the transfer as follows:
# curl -C - -O http://www.doiido.com/dodo1.jpg
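In practice the sequence looks like this, with the interruption simulated by pressing Ctrl-C mid-transfer:
# curl -O http://www.doiido.com/dodo1.jpg
^C
# curl -C - -O http://www.doiido.com/dodo1.jpg
With "-C -", curl inspects the partially downloaded file and works out the resume offset by itself; a numeric offset can be passed instead of the dash.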


10. Upload files
curl can upload files as well as download them, using the built-in option -T:
# curl -T dodo1.jpg -u username:password ftp://www.doiido.com/img/
This uploads the file dodo1.jpg to the FTP server.
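If the URL ends in a file name rather than a directory, -T uploads under that name; a sketch with a placeholder target name:
# curl -T dodo1.jpg -u username:password ftp://www.doiido.com/img/newname.jpg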


11. Show errors when a fetch fails
# curl -f http://www.doiido.com/error
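With -f, curl suppresses the server's HTML error page and instead reports the failure through its exit status (22 for HTTP page errors), which scripts can test:
# curl -f -s http://www.doiido.com/error || echo "request failed"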



Other parameters:
-a/--append append to the target file when uploading
--anyauth allow any authentication method
--basic use HTTP Basic authentication
-B/--use-ascii use ASCII/text transfer
-d/--data <data> send data with an HTTP POST
--data-ascii <data> POST the data as ASCII
--data-binary <data> POST the data as binary
--negotiate use HTTP Negotiate authentication
--digest use HTTP Digest authentication
--disable-eprt prohibit the use of EPRT or LPRT
--disable-epsv prohibit the use of EPSV
--egd-file <file> set the EGD socket path for random data (SSL)
--tcp-nodelay use the TCP_NODELAY option
-E/--cert <cert[:passwd]> client certificate file and password (SSL)
--cert-type <type> certificate file type (DER/PEM/ENG) (SSL)
--key <key> private key file name (SSL)
--key-type <type> private key file type (DER/PEM/ENG) (SSL)
--pass <pass> private key password (SSL)
--engine <eng> crypto engine to use (SSL); "--engine list" for a list
--cacert <file> CA certificate (SSL)
--capath <directory> CA directory (made using c_rehash) to verify the peer against (SSL)
--ciphers <list> SSL ciphers
--compressed request a compressed response (using deflate or gzip)
--connect-timeout <seconds> set the maximum connect time
--create-dirs create the local directory hierarchy as needed
--crlf convert LF to CRLF on upload
--ftp-create-dirs create remote directories if they do not exist
--ftp-method [multicwd/nocwd/singlecwd] control the use of CWD
--ftp-pasv use PASV/EPSV instead of PORT
--ftp-skip-pasv-ip ignore the IP address when using PASV
--ftp-ssl attempt SSL/TLS for the FTP transfer
--ftp-ssl-reqd require SSL/TLS for the FTP transfer
-F/--form <name=content> emulate an HTTP form submission
--form-string <name=string> emulate an HTTP form submission with a literal string
-g/--globoff disable URL sequences and ranges using {} and []
-G/--get send the data with a GET request
-h/--help help
-H/--header <line> pass a custom header line to the server
--ignore-content-length ignore the length from the HTTP header
-i/--include include protocol headers in the output
-I/--head show document information (headers) only
-j/--junk-session-cookies ignore session cookies when reading files
--interface <interface> use the specified network interface/address
--krb4 <level> use KRB4 with the specified security level
-k/--insecure allow connections to SSL sites without certificates
-K/--config <file> read the specified configuration file
-l/--list-only list only the file names in an FTP directory
--limit-rate <rate> limit the transfer speed
--local-port <num> force use of the given local port number
-m/--max-time <seconds> set the maximum transfer time
--max-redirs <num> set the maximum number of redirects to follow
--max-filesize <bytes> set the maximum size of a file to download
-M/--manual display the full manual
-n/--netrc read the user name and password from the .netrc file
--netrc-optional use .netrc or the URL; overrides -n
--ntlm use HTTP NTLM authentication
-N/--no-buffer disable buffering of the output stream
-p/--proxytunnel tunnel through the HTTP proxy
--proxy-anyauth allow any proxy authentication method
--proxy-basic use Basic authentication on the proxy
--proxy-digest use Digest authentication on the proxy
--proxy-ntlm use NTLM authentication on the proxy
-P/--ftp-port <address> use PORT with the given address instead of PASV
-Q/--quote <cmd> send a command to the server before the file transfer
--random-file <file> file to read random data from (SSL)
-R/--remote-time preserve the remote file's timestamp on the local file
--retry <num> number of retries when the transfer has problems
--retry-delay <seconds> interval between retries when the transfer has problems
--retry-max-time <seconds> maximum total time for retries when the transfer has problems
-S/--show-error show errors
--socks4 <host[:port]> use a SOCKS4 proxy on the given host and port
--socks5 <host[:port]> use a SOCKS5 proxy on the given host and port
-t/--telnet-option <OPT=val> set a telnet option
--trace <file> write a debug trace to the specified file
--trace-ascii <file> like --trace but without the hex output
--trace-time add time stamps to trace/verbose output
--url <URL> specify a URL to work with
-U/--proxy-user <user[:password]> set the proxy user name and password
-V/--version display version information
-X/--request <command> specify the request command (method) to use
-y/--speed-time time the speed must stay below the limit before aborting; defaults to 30
-Y/--speed-limit stop the transfer if slower than this speed for speed-time seconds
-z/--time-cond <time> transfer only based on the given time condition
-0/--http1.0 use HTTP 1.0
-1/--tlsv1 use TLSv1 (SSL)
-2/--sslv2 use SSLv2 (SSL)
-3/--sslv3 use SSLv3 (SSL)
--3p-quote like -Q, for the source URL in a third-party transfer
--3p-url source URL for a third-party transfer
--3p-user user name and password for the third-party transfer source
-4/--ipv4 use IPv4
-6/--ipv6 use IPv6
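Two of the most used entries above are -d (POST data) and -F (multipart form upload). Minimal sketches against the placeholder host (/form.cgi and the field names are assumptions for illustration):
# curl -d "name=doiido&age=1" http://www.doiido.com/form.cgi
# curl -F "file=@dodo1.jpg" -F "note=avatar" http://www.doiido.com/form.cgi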
