Linux Command: curl command details

Source: Internet
Author: User
Tags: http, authentication

Command: curl
In Linux, curl is a command-line file transfer tool that works with URL syntax. It is a powerful HTTP command-line tool that supports both file upload and download, making it a comprehensive transfer tool, although by convention it is usually called a download tool.


Syntax: # curl [option] [url]


Common parameters:
-A/--user-agent <string>: set the user agent sent to the server
-b/--cookie <name=string/file>: cookie string, or the file to read cookies from
-c/--cookie-jar <file>: write cookies to this file after the operation completes
-C/--continue-at <offset>: resume a transfer at the given offset
-D/--dump-header <file>: write the received header information to this file
-e/--referer <URL>: source (referer) URL
-f/--fail: fail silently, without the HTML error page, when the server returns an error
-o/--output <file>: write output to the given file
-O/--remote-name: write output to a file, keeping the remote file's name
-r/--range <range>: retrieve only the given byte range from an HTTP/1.1 or FTP server
-s/--silent: silent mode; no output
-T/--upload-file <file>: upload a file
-u/--user <user[:password]>: set the server user and password
-w/--write-out <format>: what to output after the transfer completes
-x/--proxy <host[:port]>: use the given proxy
-#/--progress-bar: display a progress bar showing the current transfer status


Example:
1. Basic usage
# curl http://www.doiido.com
After execution, the HTML of www.doiido.com is printed to the screen.
PS: Since Linux is often installed without a desktop, and therefore without a browser, this method is frequently used to test whether a server can reach a website.


2. Save the accessed webpage
2.1: Use the Linux redirection feature to save it
# curl http://www.doiido.com > doiido.html



2.2: Use curl's built-in option -o (lowercase) to save the webpage
$ curl -o doiido.html http://www.doiido.com
After execution, a progress meter like the following is displayed. When it reaches 100%, the save has succeeded.
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 79684    0 79684    0     0  3437k      0 --:--:-- --:--:-- --:--:-- 7781k



2.3: Use curl's built-in option -O (uppercase) to save a file from the webpage.
Note that the URL must point to a specific file; otherwise nothing can be fetched.
# curl -O http://www.doiido.com/hello.sh


3. Test a webpage's return code
# curl -o /dev/null -s -w %{http_code} www.doiido.com
PS: In scripts, this is commonly used to test whether a website is up.
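A minimal sketch of such a check as a shell script (the URL and messages are placeholders):
#!/bin/bash
# Fetch only the HTTP status code; discard the body.
code=$(curl -o /dev/null -s -w '%{http_code}' www.doiido.com)
if [ "$code" = "200" ]; then
    echo "site is up"
else
    echo "site returned $code"
fi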


4. Specify a proxy server and its port
It is often necessary to go through a proxy to reach the Internet (for example, when outbound access requires a proxy, or when your IP has been blocked by the target site). Fortunately, curl supports proxies via the built-in option -x.
# curl -x 192.168.100.100:1080 http://www.doiido.com
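If the proxy itself requires authentication, the credentials can be supplied with -U/--proxy-user (the address and credentials below are placeholders):
# curl -x 192.168.100.100:1080 -U user:password http://www.doiido.com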


5. Cookies
Some websites use cookies to record session information. A browser such as Chrome handles cookies easily, and curl can do the same by adding the relevant parameters.
5.1: Save the cookies in the HTTP response. Built-in option: -c (lowercase)
# curl -c cookiec.txt http://www.doiido.com
After execution, the cookie information is saved to cookiec.txt.


5.2: Save the header information in the HTTP response. Built-in option: -D
# curl -D cookied.txt http://www.doiido.com
After execution, the header (including its cookie) information is saved to cookied.txt.

Note: The cookie file generated by -c (lowercase) is different from the raw headers dumped by -D.


5.3: Use a cookie
Many websites monitor your cookie information to determine whether you are accessing them according to their rules, so we need to send the saved cookie information. Built-in option: -b (lowercase)
# curl -b cookiec.txt http://www.doiido.com
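Putting -c and -b together gives a typical session flow: log in once and save the cookie, then reuse it. A sketch, assuming a hypothetical login form at /login with fields user and pass:
# curl -c cookiec.txt -d "user=dodo&pass=123456" http://www.doiido.com/login
# curl -b cookiec.txt http://www.doiido.com/member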


6. Imitate the browser
Some websites require access from a specific browser, or even a specific version. curl's built-in option -A lets us specify a browser (user agent) when accessing the website.
# curl -A "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.0)" http://www.doiido.com
In this way, the server will believe it is being accessed by IE 8.0.
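Any user-agent string can be supplied. For instance, to appear as a Firefox build on Linux (the version string here is purely illustrative):
# curl -A "Mozilla/5.0 (X11; Linux x86_64; rv:60.0) Gecko/20100101 Firefox/60.0" http://www.doiido.com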


7. Forging the referer (anti-leech checks)
Many servers check the HTTP Referer to control access. For example, you first visit the home page and then the mailbox page linked from it; the Referer of the mailbox request is then the home page's address. If the server sees that the Referer of a mailbox request is not the home page's address, it concludes the request is hotlinked.
curl's built-in option -e lets us set the Referer:
# curl -e "www.doiido.com" http://mail.doiido.com
In this way, the server will assume you clicked a link on www.doiido.com.


8. Download files
8.1: Use curl to download a file.
# Use built-in option -o (lowercase)
# curl -o dodo1.jpg http://www.doiido.com/dodo1.JPG

# Use built-in option -O (uppercase)
# curl -O http://www.doiido.com/dodo1.JPG
In this way, the file is saved locally under its name on the server.



8.2: Loop download
Sometimes you need to download a series of files whose names are identical except for a trailing number. curl's URL globbing handles this:
# curl -O http://www.doiido.com/dodo[1-5].JPG
In this way, dodo1 through dodo5 are all saved.



8.3: Download and rename
# curl -O http://www.doiido.com/{hello,bb}/dodo[1-5].JPG
The files downloaded from hello/ and from bb/ are all named dodo1 through dodo5, so the second batch overwrites the first, and the files need renaming.
# curl -o #1_#2.JPG http://www.doiido.com/{hello,bb}/dodo[1-5].JPG
In this way, hello/dodo1.JPG is downloaded as hello_dodo1.JPG, and likewise for the others, effectively avoiding overwrites.


8.4: Multipart download
Sometimes the content to download is relatively large, so we can fetch it in several parts with the built-in option -r. Note that the byte ranges are inclusive, so adjacent parts must not overlap:
# curl -r 0-100 -o dodo1_part1.JPG http://www.doiido.com/dodo1.JPG
# curl -r 101-200 -o dodo1_part2.JPG http://www.doiido.com/dodo1.JPG
# curl -r 201- -o dodo1_part3.JPG http://www.doiido.com/dodo1.JPG
# cat dodo1_part* > dodo1.JPG
In this way, the concatenated dodo1.JPG is the complete file.


8.5: Download files over FTP
curl can download files over FTP, and it provides two syntaxes for doing so:
# curl -O -u username:password ftp://www.doiido.com/dodo1.JPG
# curl -O ftp://username:password@www.doiido.com/dodo1.JPG


8.6: Display a download progress bar
# curl -# -O http://www.doiido.com/dodo1.JPG


8.7: Do not display download progress information
# curl -s -O http://www.doiido.com/dodo1.JPG


9. Resumable transfer
On Windows, we can use software such as Thunder to resume interrupted transfers. curl achieves the same effect through the built-in option -C.
If the download of dodo1.JPG is suddenly interrupted, it can be resumed as follows:
# curl -C - -O http://www.doiido.com/dodo1.JPG
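For example, start the download normally, interrupt it part-way with Ctrl+C, and rerun with -C -; the trailing '-' tells curl to work out the resume offset from the partial file already on disk:
# curl -O http://www.doiido.com/dodo1.JPG        (interrupted with Ctrl+C)
# curl -C - -O http://www.doiido.com/dodo1.JPG   (picks up where it left off)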


10. Upload files
curl can not only download files but also upload them, via the built-in option -T:
# curl -T dodo1.JPG -u username:password ftp://www.doiido.com/img/
In this way, dodo1.JPG is uploaded to the FTP server.


11. Capture errors
With -f, curl does not output the HTML error page on a server error; it fails silently and exits with a non-zero code instead.
# curl -f http://www.doiido.com/error
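A sketch of catching the failure in a script (the URL is a placeholder; with -f, an HTTP error of 400 or above makes curl exit with code 22):
#!/bin/bash
curl -f -s -o /dev/null http://www.doiido.com/error
rc=$?
if [ $rc -ne 0 ]; then
    echo "request failed, curl exit code $rc"
fi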



Other parameters (reprinted from another translation):
-a/--append: append to the target file when uploading
--anyauth: allow any suitable authentication method
--basic: use HTTP basic authentication
-B/--use-ascii: use ASCII (text) transfer
-d/--data <data>: data to send in an HTTP POST (a POST example is sketched after this list)
--data-ascii <data>: POST the data as ASCII
--data-binary <data>: POST the data as binary
--negotiate: use HTTP Negotiate authentication
--digest: use HTTP digest authentication
--disable-eprt: disable the use of EPRT or LPRT
--disable-epsv: disable the use of EPSV
--egd-file <file>: set the EGD socket path for random data (SSL)
--tcp-nodelay: use the TCP_NODELAY option
-E/--cert <cert[:passwd]>: client certificate file and password (SSL)
--cert-type <type>: certificate file type (DER/PEM/ENG) (SSL)
--key <key>: private key file name (SSL)
--key-type <type>: private key file type (DER/PEM/ENG) (SSL)
--pass <pass>: private key password (SSL)
--engine <eng>: crypto engine to use (SSL); "--engine list" shows the available ones
--cacert <file>: CA certificate (SSL)
--capath <directory>: CA directory (made using c_rehash) to verify the peer against (SSL)
--ciphers <list>: SSL ciphers to use
--compressed: request a compressed response (using deflate or gzip)
--connect-timeout <seconds>: set the maximum connection time
--create-dirs: create the local directory hierarchy as needed
--crlf: convert LF to CRLF when uploading
--ftp-create-dirs: create remote directories if they do not exist
--ftp-method [multicwd/nocwd/singlecwd]: control the use of CWD
--ftp-pasv: use PASV/EPSV instead of PORT
--ftp-skip-pasv-ip: when using PASV, ignore the IP address the server suggests
--ftp-ssl: try SSL/TLS for the FTP transfer
--ftp-ssl-reqd: require SSL/TLS for the FTP transfer
-F/--form <name=content>: simulate an HTTP form submission
--form-string <name=string>: simulate an HTTP form submission, string value
-g/--globoff: disable URL sequences and ranges using {} and []
-G/--get: send the data with a GET request
-h/--help: help
-H/--header <line>: pass a custom header line to the server
--ignore-content-length: ignore the HTTP Content-Length header
-i/--include: include protocol headers in the output
-I/--head: show document information only
-j/--junk-session-cookies: ignore session cookies when reading a cookie file
--interface <interface>: use the specified network interface/address
--krb4 <level>: use krb4 with the specified security level
-k/--insecure: allow SSL connections without certificate verification
-K/--config: read configuration from the specified file
-l/--list-only: list only the names of files in an FTP directory
--limit-rate <rate>: limit the transfer speed
--local-port <NUM>: force use of the given local port number
-m/--max-time <seconds>: set the maximum transfer time
--max-redirs <num>: set the maximum number of redirects to follow
--max-filesize <bytes>: set the maximum size of a file to download
-M/--manual: display the full manual
-n/--netrc: read the user name and password from the .netrc file
--netrc-optional: use either .netrc or the URL; overrides -n
--ntlm: use HTTP NTLM authentication
-N/--no-buffer: disable buffering of the output stream
-p/--proxytunnel: tunnel through the HTTP proxy
--proxy-anyauth: allow any proxy authentication method
--proxy-basic: use basic authentication with the proxy
--proxy-digest: use digest authentication with the proxy
--proxy-ntlm: use NTLM authentication with the proxy
-P/--ftp-port <address>: use PORT with the given address instead of PASV
-Q/--quote <cmd>: send a command to the server before the transfer
--random-file <file>: file to read random data from (SSL)
-R/--remote-time: give the local file the remote file's timestamp
--retry <num>: number of retries when the transfer fails
--retry-delay <seconds>: interval between retries when the transfer fails
--retry-max-time <seconds>: maximum total time to spend retrying
-S/--show-error: show errors
--socks4 <host[:port]>: use a SOCKS4 proxy
--socks5 <host[:port]>: use a SOCKS5 proxy
-t/--telnet-option <OPT=val>: set a telnet option
--trace <file>: write a debug trace to the specified file
--trace-ascii <file>: like --trace but without the hex output
--trace-time: add timestamps to trace/verbose output
--url <URL>: set the URL to work with
-U/--proxy-user <user[:password]>: set the proxy user name and password
-V/--version: display version information
-X/--request <command>: specify the request command (method) to use
-y/--speed-time <seconds>: how long the speed must stay below the limit before the transfer is aborted; default 30
-Y/--speed-limit <rate>: the lower speed limit, in bytes per second, below which the transfer is aborted
-z/--time-cond <time>: transfer only if the time condition is met
-0/--http1.0: use HTTP 1.0
-1/--tlsv1: use TLSv1 (SSL)
-2/--sslv2: use SSLv2 (SSL)
-3/--sslv3: use SSLv3 (SSL)
--3p-quote: like -Q, for the source URL in a third-party transfer
--3p-url: source URL for a third-party transfer
--3p-user: user name and password for the third-party transfer source
-4/--ipv4: use IPv4
-6/--ipv6: use IPv6
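As an illustration of -d, -H, and -X from the list above, here is a minimal sketch of a JSON POST (the URL and payload are hypothetical):
# curl -X POST -H "Content-Type: application/json" -d '{"user":"dodo"}' http://www.doiido.com/api/login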


curl commands in Linux

curl -I url fetches only the server's response headers, without any body content. For example:

user@minix-nb:~$ curl -I www.baidu.com
HTTP/1.1 200 OK
Date: Wed, 16 Sep 2009 11:16:23 GMT
Server: BWS/1.0
Content-Length: 3509
Content-Type: text/html
Cache-Control: private
Expires: Wed, 16 Sep 2009 11:16:23 GMT
Set-Cookie: BAIDUID=0ec3f02d099d83b4eda0c65e09a0000d6:FG=1; expires=Wed, 16-Sep-39 11:16:23 GMT; path=/; domain=.baidu.com
P3P: CP="oti dsp cor iva our ind com"

OK? :)
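The headers can also be piped to other tools; for example, to pull out just the Content-Type (a sketch):
user@minix-nb:~$ curl -sI www.baidu.com | grep -i content-type
Content-Type: text/html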

In Linux, what is the difference between the telnet command and the curl command? What are their advantages and disadvantages?

Telnet is like being let into someone else's room (machine), where you can cook (edit, modify) or drink (view).
curl is different: you cannot go in, but you can fetch what you need from a distance, as if through a telescope.
