Curl Command Usage Encyclopedia

This article is a summary of http://www.51osos.com/a/Linux_CentOS_RedHat/Linuxjichu/2010/1025/curl.html and the blog post http://hi.baidu.com/yschen0925/blog/item/d62851072f10eaca7b894790.html.

Today 51osos (51 Open Source) gives you a detailed explanation of the curl command.

curl can be viewed as a command-line browser.

1. Send a gzip-enabled request
curl -i http://www.sina.com.cn/ -H "Accept-Encoding: gzip,deflate"

2. Monitor web page response time
curl -o /dev/null -s -w "time_connect: %{time_connect}\ntime_starttransfer: %{time_starttransfer}\ntime_total: %{time_total}\n" "http://www.kklinux.com"

3. Monitor site availability
curl -o /dev/null -s -w "%{http_code}" "http://www.kklinux.com"
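As a small sketch of how this could be used in a script (the 200 check and the echo message are my own illustrative assumptions, not part of the original tip):

status=$(curl -o /dev/null -s -w "%{http_code}" "http://www.kklinux.com")
# anything other than 200 is treated as "not available" in this sketch
if [ "$status" != "200" ]; then
    echo "site appears to be down (HTTP $status)"
fi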

4. Make an HTTP/1.0 request (the default is HTTP/1.1)
curl -0 ........
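For example, a minimal sketch (the URL here is only a placeholder, not from the original text):

curl -0 http://www.example.com/

The -0 option tells curl to speak HTTP 1.0 instead of the default HTTP 1.1.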
1) Read a web page
$ curl http://www.linuxidc.com
2) Save the web page
$ curl http://www.linuxidc.com > page.html
$ curl -o page.html http://www.linuxidc.com
3) Specify the proxy server and port to use: -x
$ curl -x 123.45.67.89:1080 -o page.html http://www.linuxidc.com
4) Use cookies to record session information
$ curl -x 123.45.67.89:1080 -o page.html -D cookie0001.txt http://www.linuxidc.com
The -D option saves the cookie information from the HTTP response to a special file, so when the page is saved to page.html, the cookie information is also stored in cookie0001.txt.
5) So, how do you reuse the cookie information left behind by your last visit?
Use the -b option to append the previously saved cookie information to the HTTP request:
$ curl -x 123.45.67.89:1080 -o page1.html -D cookie0002.txt -b cookie0001.txt http://www.linuxidc.com

6) Browser information ~~~~
Use the -A option to specify the browser information reported for this visit:
curl -A "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0)" -x 123.45.67.89:1080 -o page.html -D cookie0001.txt http://www.yahoo.com
This way, the server will think you are IE 6.0 running on Windows 2000, heh heh, while in fact you might be on a Mac.
And "Mozilla/4.73 [en] (X11; U; Linux 2.2 i686)" tells the other side you are running Netscape 4.73 on a Linux PC, oh oh

7) Referer
Another restriction commonly used on the server side is to check the Referer of HTTP requests. For example, you are supposed to visit the home page first and then the download page; the Referer of that second request should be the address of the page you visited successfully first. If the server finds that the Referer of a request for the download page is not the home page address, it concludes the request is hotlinking ~~~~~
How annoying ~~~ but I just want to hotlink ~~~~~
Luckily curl gives us the option of setting the Referer: -e
curl -A "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0)" -x 123.45.67.89:1080 -e "mail.yahoo.com" -o page.html -D cookie0001.txt http://www.yahoo.com
This way you can fool the server into believing you followed a link from mail.yahoo.com, oh oh

8) Downloading files with curl
We just used -o to save a page to a file; downloading other files works exactly the same way.
For example: curl -o 1.jpg http://cgi2.tky.3web.ne.jp/~zzh/screen1.JPG
Here is a new option: -O (uppercase O), used like this: curl -O http://cgi2.tky.3web.ne.jp/~zzh/screen1.JPG
This saves the file locally under the same name it has on the server.
Here is something even more useful.
Suppose that besides screen1.JPG there are also screen2.JPG, screen3.JPG, ..., screen10.JPG to download. Do we really have to write a script for that?
No need.
In curl, just write this:
curl -O http://cgi2.tky.3web.ne.jp/~zzh/screen[1-10].JPG
Oh, wow, great ~~~
9)
Let's continue with downloads.
curl -O http://cgi2.tky.3web.ne.jp/~{zzh,nick}/[001-201].JPG
This produces the following downloads, in order:
~zzh/001.jpg
~zzh/002.jpg
...
~zzh/201.jpg
~nick/001.jpg
~nick/002.jpg
...
~nick/201.jpg
Convenient enough, haha.
Hey, don't get happy too soon.
Because the file names under zzh/ and nick/ are both 001, 002, ..., 201, the files downloaded later will overwrite the earlier ones ~~~
No problem, we have an even better trick.
curl -o "#2_#1.JPG" http://cgi2.tky.3web.ne.jp/~{zzh,nick}/[001-201].JPG
-- This is... downloading with custom file names.
-- Right, hehe.
#1 is a variable referring to the first variable part, {zzh,nick}: its first value is zzh, its second value is nick.
#2 refers to the second variable part, [001-201]: its values run from 001 to 201.
So the downloaded file names become:
Original: ~zzh/001.JPG ---> Downloaded: 001_zzh.JPG
Original: ~nick/001.JPG ---> Downloaded: 001_nick.JPG
This way, duplicate file names are no longer a problem, hehe

9)
Let's keep talking about downloads.
On Windows we often use tools like FlashGet to download in several chunks in parallel and to resume interrupted downloads.
curl does not lose out to anyone in these areas either, hehe.
For example, if we are downloading screen1.JPG and the connection suddenly drops, we can resume the transfer:
curl -C - -O http://cgi2.tky.3web.ne.jp/~zzh/screen1.JPG
Of course, don't try to fool me with a file half-downloaded by FlashGet ~~~~~ half-finished files from other download software may not be usable ~~~
To download in chunks, we use the -r option.
Here is an example.
Suppose we have a http://cgi2.tky.3web.ne.jp/~zzh/zhao1.mp3 to download (Miss Zhao's phone recitation :D )
We can use this command:
curl -r 0-10240 -o "zhao.part1" http://cgi2.tky.3web.ne.jp/~zzh/zhao1.mp3 &\
curl -r 10241-20480 -o "zhao.part2" http://cgi2.tky.3web.ne.jp/~zzh/zhao1.mp3 &\
curl -r 20481-40960 -o "zhao.part3" http://cgi2.tky.3web.ne.jp/~zzh/zhao1.mp3 &\
curl -r 40961- -o "zhao.part4" http://cgi2.tky.3web.ne.jp/~zzh/zhao1.mp3
This downloads the file in pieces.
But you will have to merge the pieces yourself afterwards.
On UNIX or macOS, use cat zhao.part* > zhao.mp3
On Windows, use copy /b, as sketched below, hehe
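A minimal sketch of the copy /b merge (assuming exactly the four part files from the example above):

copy /b zhao.part1+zhao.part2+zhao.part3+zhao.part4 zhao.mp3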
All of the above uses the HTTP protocol for downloading; in fact, FTP works too.
Usage:
curl -u name:passwd ftp://ip:port/path/file
or, as you might guess:
curl ftp://name:passwd@ip:port/path/file

10) The option for uploading is -T
For example, to send a file to an FTP server: curl -T localfile -u name:passwd ftp://upload_site:port/path/
Of course, you can also upload a file to an HTTP server,
for example: curl -T localfile http://cgi2.tky.3web.ne.jp/~zzh/abc.cgi
Note that in this case the HTTP PUT method is used.
Speaking of PUT, that naturally reminds me of some other methods we haven't mentioned yet.
GET and POST must not be forgotten.
To submit an HTTP form, the most common modes are POST and GET.
GET mode needs no option at all; just write the variables into the URL.
For example:
curl "http://www.yahoo.com/login.cgi?user=nickwolfe&password=12345"
The option for POST mode is -d.
For example: curl -d "user=nickwolfe&password=12345" http://www.yahoo.com/login.cgi
This is equivalent to submitting a login request to the site ~~~~~
Whether to use GET or POST depends on how the server-side program is set up.
One thing to note is file upload in POST mode. For example, given a form like this:
<form method="POST" enctype="multipart/form-data" action="http://cgi2.tky.3web.ne.jp/~zzh/up_file.cgi">
<input type="file" name="upload">
<input type="submit" name="nick" value="go">
</form>
To simulate it with curl, the syntax is:
curl -F upload=@localfile -F nick=go http://cgi2.tky.3web.ne.jp/~zzh/up_file.cgi
Having said all this, curl actually has many more tricks and uses.
For example, to use a local certificate with HTTPS, you can do this:
curl -E localcert.pem https://remote_server
Or you can use curl to look up words in a dictionary via the DICT protocol ~~~~~
curl dict://dict.org/d:computer

Today, in order to check whether every domain name on our Hedgehog Host servers is on file (ICP-registered), I was about to use wget when I came across the curl command-line tool. Its POST calls turned out to work just fine, and it is particularly convenient for submitting information and testing with different parameters. That is very useful to me, because I need to query the filing records for hundreds of thousands of domain names at miibeian.gov.cn. I found this article very good, so I am reposting it here.
My goal:
curl -d "cxfs=1&ym=xieyy.cn" http://www.miibeian.gov.cn/baxx_cx_servlet
Filter the returned information, extract the record (filing) number, and set a flag, then store the domain name, the record number and the flag in the database.
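A rough sketch of what such a batch query might look like (the loop, the input/output file names, and the grep pattern for the record number are all my own assumptions; the real structure of the miibeian.gov.cn response may differ):

#!/bin/sh
# read one domain per line from domains.txt (hypothetical input file)
while read ym; do
    resp=$(curl -s -d "cxfs=1&ym=$ym" http://www.miibeian.gov.cn/baxx_cx_servlet)
    # try to pull out something that looks like a record number; the pattern is only a guess
    baxx=$(echo "$resp" | grep -o 'ICP备[0-9]*号' | head -n 1)
    if [ -n "$baxx" ]; then flag=1; else flag=0; fi
    # store domain, record number and flag as one CSV line (stand-in for a database insert)
    echo "$ym,$baxx,$flag" >> result.csv
done < domains.txt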

Submitting POST data that contains spaces with curl
Today I ran into a situation: I wanted to use curl to log into a web page and discovered that the POST data contained a space. For example, the user name is "abcdef" and the password is "abc def", which contains a space. Submitting it the way I used to:
curl -D cookie -d "username=abcdef&password=abc def" http://login.xxx.com/ resulted in a login failure.

So I looked at the curl manual with man curl, and found:
-d/--data (HTTP) Sends the specified data in a POST request to the HTTP server, in a way that can emulate as if a user has filled in a HTML form and pressed the submit button. Note that the data is sent exactly as specified with no extra processing. The data is expected to be "url-encoded". This will cause curl to pass the data to the server using the content-type application/x-www-form-urlencoded. Compare to -F/--form. If this option is used more than once on the same command line, the data pieces specified will be merged together with a separating &-letter. Thus, using '-d name=daniel -d skill=lousy' would generate a post chunk that looks like 'name=daniel&skill=lousy'.
So I changed it to:
curl -D cookie -d "username=abcdef" -d "password=abc def" http://login.xxx.com/ and the login succeeded.
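As an aside, newer versions of curl also provide --data-urlencode, which URL-encodes each value for you, so the space no longer needs special handling; a minimal sketch against the same hypothetical login URL:

curl -D cookie --data-urlencode "username=abcdef" --data-urlencode "password=abc def" http://login.xxx.com/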

curl is a very powerful HTTP command-line tool under Linux, and its functionality is extensive.

1) Without further ado, let's start here.

$ curl http://www.linuxidc.com

After you hit Enter, the HTML of www.linuxidc.com is dumped onto your screen.

2) Well, what if you want to save the page you just read? Is it like this?

$ curl http://www.linuxidc.com > page.html

Sure, but it doesn't have to be that troublesome.

curl has a built-in option for saving HTTP results, namely -o:

$ curl -o page.html http://www.linuxidc.com

This way you can watch a download progress meter on the screen, and when it reaches 100%, it's done.

3) What? You can't access it? Then your proxy must not be set up.

When using curl, this option specifies the proxy server and port to use for HTTP access: -x

$ curl -x 123.45.67.89:1080 -o page.html http://www.linuxidc.com

4) Some websites are annoying to visit because they use cookies to record session information.

Browsers like IE and NN handle cookie information easily, of course, but what about our curl?.....

Let's learn this option: -D <- it saves the cookie information from the HTTP response to a special file.

$ curl -x 123.45.67.89:1080 -o page.html -D cookie0001.txt http://www.linuxidc.com

In this way, when the page is saved into page.html, the cookie information is also stored in cookie0001.txt.

5) So, how do you reuse the cookie information left behind by your last visit? You know, many sites rely on your cookie information to decide whether you are accessing their site improperly.

This time we use the -b option to append the previously saved cookie information to the HTTP request:

$ curl -x 123.45.67.89:1080 -o page1.html -D cookie0002.txt -b cookie0001.txt http://www.linuxidc.com

In this way, we can simulate almost everything IE does when accessing a web page.

6) Wait a second ~ I seem to have forgotten something ~

That's right, the browser information.

Some annoying sites insist that we access them with certain browsers, and sometimes, even worse, with certain specific versions. Who has the time to hunt down those weird browsers just for them?

Fortunately, curl provides a useful option that lets us arbitrarily specify the browser information reported for the visit: -A

$ curl -A "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0)" -x 123.45.67.89:1080 -o page.html -D cookie0001.txt http://www.linuxidc.com

This way, the server will think you are IE 6.0 running on Windows 2000, heh heh, while in fact you might be on a Mac.

And "Mozilla/4.73 [en] (X11; U; Linux 2.2 i686)" tells the other side you are running Netscape 4.73 on a Linux PC, oh oh

7) Another restriction commonly used on the server side is to check the Referer of HTTP requests. For example, you are supposed to visit the home page first and then the download page; the Referer of that second request should be the address of the page you visited successfully first. If the server finds that the Referer of a request for the download page is not the home page address, it concludes the request is hotlinking ~

How annoying ~ but I just want to hotlink ~.

Luckily curl gives us the option of setting the Referer: -e

$ curl -A "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0)" -x 123.45.67.89:1080 -e "mail.linuxidc.com" -o page.html -D cookie0001.txt http://www.linuxidc.com

This way you can fool the server into believing you followed a link from mail.linuxidc.com, oh oh

8) Having written this far, I realize something important is missing: downloading files with curl.

As we just saw, -o saves a page to a file; downloading other files works the same way. For example:

$ curl -o 1.jpg http://cgi2.tky.3web.ne.jp/~zzh/screen1.JPG

Here is a new option: -O (uppercase O)

$ curl -O http://cgi2.tky.3web.ne.jp/~zzh/screen1.JPG

This saves the file locally under the same name it has on the server.

Here is something even more useful.

Suppose that besides screen1.JPG there are also screen2.JPG, screen3.JPG, ..., screen10.JPG to download. Do we really have to write a script for that?

No need.

In curl, just write this:

$ curl -O http://cgi2.tky.3web.ne.jp/~zzh/screen[1-10].JPG

Oh, wow, great ~

9) Let's continue with downloads.

$ curl -O http://cgi2.tky.3web.ne.jp/~{zzh,nick}/[001-201].JPG

This produces the following downloads, in order:

~zzh/001.jpg

~zzh/002.jpg

...

~zzh/201.jpg

~nick/001.jpg

~nick/002.jpg

...

~nick/201.jpg

Convenient enough, haha.

Hey, don't get happy too soon.

Because the file names under zzh/ and nick/ are both 001, 002, ..., 201, the files downloaded later will overwrite the earlier ones ~

No problem, we have an even better trick.

$ curl -o "#2_#1.JPG" http://cgi2.tky.3web.ne.jp/~{zzh,nick}/[001-201].JPG

-- This is... downloading with custom file names. -- Right, hehe.

With custom file names, the downloads become: original ~zzh/001.JPG --> downloaded 001_zzh.JPG; original ~nick/001.JPG --> downloaded 001_nick.JPG.

This way, duplicate file names are no longer a problem, hehe
