Linux Curl Command parameter details (6/23)

Source: Internet
Author: User
Tags: send cookies



curl is a command-line tool for uploading or downloading files using URL syntax. It supports many protocols, including HTTP, HTTPS, FTP, FTPS and TELNET, and is often used to fetch web pages and to monitor the status of web servers. On Linux it is a comprehensive transfer tool that handles both upload and download, though by convention it is usually thought of as a download tool. It can fairly be called a powerful command-line HTTP client.


-A/--user-agent <string>           Set the User-Agent string sent to the server
-b/--cookie <name=string/file>     Cookie string, or file to read cookies from
-c/--cookie-jar <file>             Write cookies to this file after the operation
-C/--continue-at <offset>          Resume a transfer at the given offset
-D/--dump-header <file>            Write the header information to a file
-e/--referer <URL>                 Set the source (Referer) URL
-f/--fail                          Do not show the HTTP error page when the request fails
-o/--output <file>                 Write output to a file
-O/--remote-name                   Write output to a file named after the remote file
-r/--range <range>                 Retrieve only the given byte range from an HTTP/1.1 or FTP server
-s/--silent                        Silent mode; do not output anything
-T/--upload-file <file>            Upload a file
-u/--user <user[:password]>        Set the server user and password
-w/--write-out <format>            What to output after the transfer completes
-x/--proxy <host[:port]>           Use the specified HTTP proxy




Example:

1. Basic Usage





# curl http://www.linux.com





Once executed, the HTML of www.linux.com is displayed on the screen.

PS: since Linux servers are often installed without a desktop, and therefore without a browser, this method is frequently used to test from a server whether a website is reachable.
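A minimal sketch of that reachability check wrapped in a reusable script (the URL is a placeholder; -s suppresses output, -f makes curl fail on HTTP errors, and --connect-timeout bounds the wait):

```shell
#!/bin/sh
# Reachability check: curl exits non-zero when the site cannot be reached
# (or, with -f, when the server answers with an HTTP error).
check_site() {
    if curl -sf --connect-timeout 5 -o /dev/null "$1"; then
        echo "$1 is reachable"
    else
        echo "$1 is NOT reachable"
    fi
}

check_site "http://www.linux.com"
```

Because the result is carried in curl's exit status, the same pattern drops straight into cron jobs or monitoring scripts.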






2. Save the visited Web page

2.1: Save it using Linux's redirection feature





# curl http://www.linux.com >> linux.html





2.2: You can use curl's built-in option -o (lowercase) to save the web page





# curl -o linux.html http://www.linux.com





After execution, output like the following appears; 100% indicates the save succeeded:





  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 79684    0 79684    0     0  3437k      0 --:--:-- --:--:-- --:--:-- 7781k





2.3: You can use curl's built-in option -O (uppercase) to save a file from a web page

Note that the URL here must point to a specific file, otherwise there is nothing to fetch.





# curl -O http://www.linux.com/hello.sh





3. Test page return value





# curl -o /dev/null -s -w "%{http_code}" www.linux.com





PS: testing this way whether a website is working properly is very common in scripts.
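In a script the status code is usually captured into a variable and compared; a minimal sketch (the URL is a placeholder):

```shell
#!/bin/sh
# Monitoring idiom: discard the body (-o /dev/null), stay quiet (-s) and
# print only the HTTP status code (-w '%{http_code}').
url="http://www.linux.com"
status=$(curl -o /dev/null -s --connect-timeout 5 -w '%{http_code}' "$url")
if [ "$status" = "200" ]; then
    echo "$url is OK"
else
    echo "$url returned status $status"
fi
```

Any non-2xx value (or 000 when the connection itself fails) can then trigger an alert.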






4. Specify proxy server and its port

Much of the time you need a proxy server to reach the Internet (or the site you are fetching with curl has blocked your IP address). Fortunately, curl supports proxies through the built-in option -x.





# curl -x 192.168.100.100:1080 http://www.linux.com





5. Cookies

Some websites use cookies to record session information. Browsers such as Chrome handle cookies automatically, but handling cookies in curl is just as easy with the relevant options.

5.1: Save the cookie information from the HTTP response. Built-in option: -c (lowercase)





# curl -c cookiec.txt http://www.linux.com





After execution, the cookie information is stored in cookiec.txt.






5.2: Save the header information from the HTTP response. Built-in option: -D (uppercase)





# curl -D cookied.txt http://www.linux.com





After execution, the header (including cookie) information is stored in cookied.txt.






Note: the cookie file generated by -c (lowercase) and the headers saved by -D are not the same.







5.3: Using cookies

Many websites check your cookie information to decide whether you are visiting them as expected, so we need to send the stored cookie information. Built-in option: -b





# curl -b cookiec.txt http://www.linux.com
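The two options are typically combined in one session: -c captures the cookies set by a login request, and -b replays them afterwards. A sketch, with placeholder URLs and form fields:

```shell
#!/bin/sh
# Hypothetical session workflow: log in once, then reuse the session cookie.
jar=$(mktemp)

# 1. POST the login form and let -c write any Set-Cookie values to the jar.
curl -s --connect-timeout 5 -c "$jar" \
     -d "user=me&pass=secret" -o /dev/null "http://www.linux.com/login"

# 2. Replay the stored cookies on later requests with -b.
curl -s --connect-timeout 5 -b "$jar" -o /dev/null "http://www.linux.com/profile"

rm -f "$jar"
```

The jar file is in the Netscape cookie format, so the same file can be shared between separate curl invocations.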





6. Imitate the browser

Some websites require a specific browser, or even a specific browser version, to access them. curl's built-in option -A lets us specify the browser identity used to access the site.





# curl -A "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.0)" http://www.linux.com





The server will then assume it is being accessed with IE 8.0.






7. Forged Referer (hotlink protection)

Many servers check the HTTP Referer header to control access. For example: you first visit the home page and then visit the mailbox page; the Referer on the mailbox request should be the home-page address you just came from. If the server finds that the Referer on the mailbox request is not the home-page address, it concludes the request is hotlinked.

curl's built-in option -e lets us set the Referer.





# curl -e "www.linux.com" http://mail.linux.com





This will make the server think that you have clicked a link from www.linux.com.






8. Download the file

8.1: Download the file using Curl.

# Using the built-in option -o (lowercase)





# curl -o dodo1.jpg http://www.linux.com/dodo1.jpg





# Using the built-in option -O (uppercase)





# curl -O http://www.linux.com/dodo1.JPG





This saves the file locally under the name it has on the server.






8.2: Cyclic download

Sometimes the images to download share the same name apart from a trailing number or suffix.





# curl -O http://www.linux.com/dodo[1-5].JPG





This saves all of dodo1, dodo2, dodo3, dodo4 and dodo5.






8.3: Download with renaming





# curl -O http://www.linux.com/{hello,bb}/dodo[1-5].JPG





Because the file names inside hello and inside bb are both dodo1 through dodo5, the second download overwrites the first, so the files need to be renamed.





# curl -o "#1_#2.JPG" http://www.linux.com/{hello,bb}/dodo[1-5].JPG





This way, a file such as hello/dodo1.JPG is saved as hello_dodo1.JPG, and likewise for the others, which effectively avoids files overwriting each other.






8.4: Segmented download

Sometimes the download is relatively large, and we can fetch it in segments using the built-in option -r.





# curl -r 0-100 -o dodo1_part1.JPG http://www.linux.com/dodo1.JPG
# curl -r 100-200 -o dodo1_part2.JPG http://www.linux.com/dodo1.JPG
# curl -r 200- -o dodo1_part3.JPG http://www.linux.com/dodo1.JPG
# cat dodo1_part* > dodo1.JPG





After that, the complete dodo1.JPG can be viewed.
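The same idea can be scripted for any chunk size; a sketch, assuming the total size is known in advance (a real script would read it from `curl -sI` output). The part count stays below 10 here so the shell glob concatenates the parts in order:

```shell
#!/bin/sh
# Download a file of known size in fixed-size byte ranges (-r is inclusive),
# then stitch the parts back together.
fetch_in_parts() {
    url=$1; size=$2; chunk=$3; out=$4
    start=0; part=1
    while [ "$start" -lt "$size" ]; do
        end=$((start + chunk - 1))
        curl -s -r "$start-$end" -o "$out.part$part" "$url"
        start=$((end + 1))
        part=$((part + 1))
    done
    cat "$out".part* > "$out" && rm -f "$out".part*
}

# Usage (placeholder URL and sizes):
# fetch_in_parts http://www.linux.com/dodo1.JPG 307200 102400 dodo1.JPG
```

Each range can also be retried independently on failure, which is the main practical benefit of segmenting.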






8.5: Download files via FTP

curl can download files over FTP; it provides two syntaxes for doing so.





# curl -O -u username:password ftp://www.linux.com/dodo1.JPG
# curl -O ftp://username:password@www.linux.com/dodo1.JPG





8.6: Show Download progress bar





# curl -# -O http://www.linux.com/dodo1.JPG





8.7: Do not display download progress information





# curl -s -O http://www.linux.com/dodo1.JPG





9. Resuming interrupted downloads

On Windows we can use download managers such as Thunder to resume broken transfers; curl achieves the same effect with the built-in option -C.

If the download of dodo1.JPG is suddenly interrupted, you can resume it as follows:





# curl -C - -O http://www.linux.com/dodo1.JPG
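Note the extra `-` after -C: it tells curl to work out the resume offset from the partial file already on disk. A retry-loop sketch with a placeholder URL (curl's own --retry option also covers many of the same cases):

```shell
#!/bin/sh
# Keep resuming the download until it completes or we give up.
resume_fetch() {
    url=$1; tries=${2:-5}; n=0
    until curl -s --connect-timeout 5 -C - -O "$url"; do
        n=$((n + 1))
        [ "$n" -ge "$tries" ] && return 1
        sleep 2
    done
}

# Usage: resume_fetch "http://www.linux.com/dodo1.JPG"
```

On each retry curl re-reads the size of the partial file, so already-downloaded bytes are never fetched twice.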





10. Uploading Files

curl can upload files as well as download them, using the built-in option -T.





# curl -T dodo1.JPG -u username:password ftp://www.linux.com/img/





This uploads dodo1.JPG to the FTP server.






11. Display Crawl Error





# curl -f http://www.linux.com/error



Appendix: quick reference

1. Fetch a web page: curl http://www.baidu.com. If the output is garbled, transcode it with iconv: curl http://iframe.ip138.com/ic.asp | iconv -f gb2312 (see: handling text files with the iconv command on Linux/Unix systems).

2. Use a proxy. Fetch a page through an HTTP proxy:

curl -x 111.95.243.36:80 http://iframe.ip138.com/ic.asp | iconv -f gb2312
curl -x 111.95.243.36:80 -u aiezu:password http://www.baidu.com

Fetch a page through a SOCKS proxy:

curl --socks4 202.113.65.229:443 http://iframe.ip138.com/ic.asp | iconv -f gb2312
curl --socks5 202.113.65.229:443 http://iframe.ip138.com/ic.asp | iconv -f gb2312

Proxy server addresses can be obtained from crawler proxy lists.

3. Handle cookies:

curl -c /tmp/cookies http://www.baidu.com            # save cookies to the /tmp/cookies file
curl -b "key1=val1;key2=val2;" http://www.baidu.com  # send a cookie string
curl -b /tmp/cookies http://www.baidu.com            # read cookies from a file

4. Send data:

curl -G -d "name=value&name2=value2" http://www.baidu.com  # submit data with GET
curl -d "name=value&name2=value2" http://www.baidu.com     # submit data with POST
curl -d @/tmp/txt http://www.baidu.com                     # POST data read from a file
curl -F "file=@/tmp/me.txt" http://www.aiezu.com           # upload a file as a form

The last form is equivalent to an HTML form with method="POST" and enctype="multipart/form-data".

5. HTTP header handling. Set request headers:

curl -A "Mozilla/5.0 Firefox/21.0" http://www.baidu.com    # set the User-Agent request header
curl -e "http://pachong.org/" http://www.baidu.com         # set the Referer request header
curl -H "Connection: keep-alive" -H "User-Agent: Mozilla/5.0" http://www.aiezu.com  # set arbitrary headers

Process response headers:

curl -I http://www.aiezu.com               # return only the headers
curl -D /tmp/header http://www.aiezu.com   # save the HTTP headers to the /tmp/header file

6. Authentication:

curl -u aiezu:password http://www.aiezu.com   # user name/password authentication
curl -E mycert.pem https://www.baidu.com      # certificate authentication

7. Other:

curl -# http://www.baidu.com                  # show progress as a "#" bar
curl -o /tmp/aiezu http://www.baidu.com       # save the HTTP response to /tmp/aiezu

A small curl tip: enclose the request URL in double quotes; when multiple parameters are joined with &, errors may occur otherwise.







