First, Hadoop's WebHDFS supports access to HDFS through a REST API over HTTP. Link: http://hadoop.apache.org/common/docs/current/hadoop-yarn/hadoop-yarn-site/WebHDFS.html
You can perform many operations through the REST API, such as uploading and downloading files, viewing file status, and creating directories. The local Hadoop version here is Hadoop 2.0 with HttpFS installed, listening on port 14000. Uploading a file (called "Create and Write to a File" in the official documentation) is used as the example.
You can upload a file in two steps.
The first step is to submit a PUT request that is not automatically redirected and sends no file data. For example, to upload the file test.txt to the /user directory, run:

curl -i -X PUT "http://10.20.18.1:14000/webhdfs/v1/user/test.txt?user.name=hdfs&op=CREATE[&overwrite=<true|false>][&blocksize=<LONG>][&replication=<SHORT>][&permission=<OCTAL>][&buffersize=<INT>]"

The bracketed parameters are optional. overwrite indicates whether to overwrite an existing file; the default is false, although (confusingly) the official documentation lists true as the valid value. As you can see, the block size, number of replicas, and file permissions can also be set. After execution, a response like the following is returned:

HTTP/1.1 307 Temporary Redirect
Server: Apache-Coyote/1.1
Set-Cookie: hadoop.auth="u=hdfs&p=hdfs&t=simple&e=1345220043113&s=ikU/wiUsFtaTHrkPQmaya5PHjkQ="; Version=1; Path=/
Location: http://10.20.18.1:14000/webhdfs/v1/user/test.txt?op=CREATE&user.name=hdfs&data=true
Content-Type: application/json
Content-Length: 0
Date: Fri, 17 Aug 2012 06:14:03 GMT

Note that the returned status code is 307.
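As a minimal sketch, the step-1 request URL can be assembled from its parts before calling curl. The host, port, path, and user are the example values from this article; the overwrite value is a sample chosen for illustration:

```shell
#!/bin/sh
# Assemble the WebHDFS CREATE URL for step 1 (no file data is sent yet).
HOST="10.20.18.1"          # HttpFS host from the example above
PORT="14000"               # HttpFS port from the example above
HDFS_PATH="/user/test.txt" # destination path on HDFS
USER="hdfs"
OVERWRITE="true"           # optional parameter; sample value, omit if not needed

URL="http://${HOST}:${PORT}/webhdfs/v1${HDFS_PATH}?user.name=${USER}&op=CREATE&overwrite=${OVERWRITE}"
echo "$URL"
# The actual request would then be:
#   curl -i -X PUT "$URL"
```

The other optional parameters (blocksize, replication, permission, buffersize) would be appended to the query string in the same way.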
Write down the Set-Cookie and Location values from the response headers. For example, I write the Set-Cookie content to cookie.txt and the Location URL to url.txt.
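Instead of copying the two headers by hand, they can be pulled out of a saved step-1 response with sed. This is a sketch: the response text below is the sample shown above (normally you would capture it with curl -i ... > response.txt), and the output file names match the cookie.txt and url.txt used in this article:

```shell
#!/bin/sh
# Sample 307 response from step 1 (a real response uses CRLF line endings;
# you may need `tr -d '\r'` before parsing actual curl output).
cat > response.txt <<'EOF'
HTTP/1.1 307 Temporary Redirect
Server: Apache-Coyote/1.1
Set-Cookie: hadoop.auth="u=hdfs&p=hdfs&t=simple&e=1345220043113&s=ikU/wiUsFtaTHrkPQmaya5PHjkQ="; Version=1; Path=/
Location: http://10.20.18.1:14000/webhdfs/v1/user/test.txt?op=CREATE&user.name=hdfs&data=true
Content-Type: application/json
Content-Length: 0
EOF

# Extract the cookie value and the redirect URL into the two files.
sed -n 's/^Set-Cookie: \(.*\)$/\1/p' response.txt > cookie.txt
sed -n 's/^Location: \(.*\)$/\1/p'  response.txt > url.txt

cat url.txt
```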
Step 2: upload the file test.txt:

curl -i -X PUT -T test.txt -b cookie.txt --header "Content-Type: application/octet-stream" "`cat url.txt`"

Now you can see the uploaded file on HDFS. Of course, calling the hadoop-httpfs REST API from the command line is certainly not the real purpose of HttpFS.
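Putting the two saved files together, step 2 reduces to a single command. The sketch below only assembles and prints that command rather than executing it, since there is no live HttpFS server here; the cookie and URL contents are the sample values from this article:

```shell
#!/bin/sh
# Inputs as saved after step 1 (sample values from this article).
printf '%s\n' 'hadoop.auth="u=hdfs&p=hdfs&t=simple&e=1345220043113&s=ikU/wiUsFtaTHrkPQmaya5PHjkQ="; Version=1; Path=/' > cookie.txt
printf '%s\n' 'http://10.20.18.1:14000/webhdfs/v1/user/test.txt?op=CREATE&user.name=hdfs&data=true' > url.txt

# Assemble the step-2 upload command: PUT the file body to the redirect URL
# (note data=true), replaying the hadoop.auth cookie obtained in step 1.
CMD="curl -i -X PUT -T test.txt -b cookie.txt --header \"Content-Type: application/octet-stream\" \"$(cat url.txt)\""
echo "$CMD"
```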
XMLHttpRequest can also be used to upload files; here is an example (for the complete code, see the attachment). Create abc.html under /usr/lib/hadoop-httpfs/webapps/root so that it is served from the HttpFS port.
this.xhr.open(p.method, p.url, true);
this.xhr.setRequestHeader("Content-Type", "application/octet-stream"); // set the Content-Type header
this.xhr.onreadystatechange = function () {
    if (this.xhr.readyState != 4) { return; }
}.bind(this);
this.xhr.send(null); // send the file; null is used here
Here this.xhr is an XMLHttpRequest object. By the time this.xhr.send(null) completes, XMLHttpRequest has already handled the cookie and the Location redirect (note the 307 above: when sending, XMLHttpRequest automatically follows requests that return a 307 status).
This article is from the "Porridge eat not hungry" blog. Please be sure to keep this source: http://chcearth.blog.51cto.com/2179839/965704