Hadoop HDFS API Operations

Source: Internet
Author: User
Tags: hdfs

A simple introduction to basic operations with the Hadoop HDFS API. Hadoop provides a very handy shell command interface for HDFS (similar to the Linux file-manipulation commands). Hadoop also provides the HDFS API so that developers can work with HDFS programmatically, for example: copying files (from local to HDFS, or from HDFS to local), deleting files or directories, reading the contents of a file, viewing information about a file, listing all files under a directory, and appending content to a file. (Note: HDFS does not support modifying a row within a file; it only supports appending content to the end of a file.)
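For reference, the shell interface mentioned above covers the same operations as the API examples below. The paths here are illustrative; `hdfs dfs` must be run on a machine with a configured Hadoop client:

```shell
hdfs dfs -put /test.txt1 /user/root/test.txt          # local -> HDFS
hdfs dfs -get /user/root/test.txt /tmp/test.txt       # HDFS -> local
hdfs dfs -rm -r /user/root/out1                       # delete a file or directory
hdfs dfs -cat /user/root/test.txt                     # read a file
hdfs dfs -ls -R /user/root                            # list files recursively
hdfs dfs -appendToFile /test.txt1 /user/root/test.txt # append to a file
```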
First, initialize HDFS before each test and close it afterwards:
import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.BlockLocation;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.FileUtil;
import org.apache.hadoop.fs.LocatedFileStatus;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.RemoteIterator;
import org.apache.hadoop.io.IOUtils;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

private static final String HDFS_PATH = "hdfs://localhost:8020";
private Configuration conf = null;
private FileSystem fs = null;

@Before
public void beforeClass() throws IOException {
    conf = new Configuration();
    fs = FileSystem.get(URI.create(HDFS_PATH), conf);
}

@After
public void afterClass() throws IOException {
    fs.close();
}


Copying a file from local to HDFS, or from HDFS to local
@Test
public void testCopyLocalFileToHdfs() throws IOException {
    String[] args = {"/test.txt1", "hdfs://localhost:8020/user/root/test.txt"};
    if (args.length != 2) {
        System.err.println("Usage: FileCopy <source> <target>");
        System.exit(2);
    }
    InputStream in = new BufferedInputStream(new FileInputStream(args[0]));
    FileSystem fs = FileSystem.get(URI.create(args[1]), conf);
    OutputStream out = fs.create(new Path(args[1]));
    IOUtils.copyBytes(in, out, conf);
    // fs.copyFromLocalFile(new Path("/eclipse-jee-luna-R-linux-gtk-x86_64.tar.gz"),
    //         new Path(HDFS_PATH + "/user/root/"));
    fs.copyToLocalFile(
            new Path("hdfs://localhost:8020/user/root/eclipse-jee-luna-R-linux-gtk-x86_64.tar.gz"),
            new Path("/user/"));
}

Deleting Files
@Test
public void deleteFile() throws IOException {
    fs.delete(new Path("hdfs://localhost:8020/user/root/out1"), true);
}
Reading a file into an output stream
@Test
public void readFile() {
    InputStream in = null;
    try {
        in = fs.open(new Path(HDFS_PATH + "/user/root/test.txt"));
        IOUtils.copyBytes(in, System.out, conf);
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        IOUtils.closeStream(in);
    }
}
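Under the hood, `IOUtils.copyBytes` is essentially a buffered read/write loop. A minimal, dependency-free sketch of the same pattern in plain java.io (the class name, buffer size, and sample data are invented for illustration):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class CopyBytesSketch {
    // Read into a fixed-size buffer and write until end of stream,
    // the same loop IOUtils.copyBytes performs internally.
    static void copyBytes(InputStream in, OutputStream out, int bufferSize) throws IOException {
        byte[] buf = new byte[bufferSize];
        int n;
        while ((n = in.read(buf)) > 0) {
            out.write(buf, 0, n);
        }
    }

    public static void main(String[] args) throws IOException {
        InputStream in = new ByteArrayInputStream("hello hdfs".getBytes());
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        copyBytes(in, out, 4096);
        System.out.println(out.toString());
    }
}
```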

Getting information about a file
@Test
public void getFileInfo() throws IllegalArgumentException, IOException {
    FileStatus fSta = fs.getFileStatus(new Path(HDFS_PATH + "/user/root/test.txt"));
    System.out.println(fSta.getAccessTime());
    System.out.println(fSta.getBlockSize());
    System.out.println(fSta.getModificationTime());
    System.out.println(fSta.getOwner());
    System.out.println(fSta.getGroup());
    System.out.println(fSta.getLen());
    System.out.println(fSta.getPath());
    System.out.println(fSta.isSymlink());
}
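For comparison, java.nio exposes similar metadata for local files. A small self-contained sketch (the class name and temp-file contents are made up for the demo):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class FileInfoSketch {
    public static void main(String[] args) throws IOException {
        // Local-filesystem analogue of fs.getFileStatus(): java.nio gives similar metadata.
        Path p = Files.createTempFile("info-demo", ".txt");
        Files.write(p, "hello".getBytes());
        System.out.println(Files.size(p));           // like FileStatus.getLen()
        System.out.println(Files.isSymbolicLink(p)); // like FileStatus.isSymlink()
        Files.delete(p);
    }
}
```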

Listing all files under a directory
@Test
public void listFile() throws FileNotFoundException, IllegalArgumentException, IOException {
    RemoteIterator<LocatedFileStatus> iterator = fs.listFiles(new Path(HDFS_PATH + "/user/root/"), true);
    while (iterator.hasNext()) {
        System.out.println(iterator.next());
    }
    FileStatus[] fss = fs.listStatus(new Path(HDFS_PATH + "/"));
    Path[] ps = FileUtil.stat2Paths(fss);
    for (Path p : ps) {
        System.out.println(p);
    }
    FileStatus sta = fs.getFileStatus(
            new Path("hdfs://localhost:8020/user/root/eclipse-jee-luna-R-linux-gtk-x86_64.tar.gz"));
    BlockLocation[] bls = fs.getFileBlockLocations(sta, 0, sta.getLen());
    for (BlockLocation b : bls) {
        for (String s : b.getTopologyPaths()) System.out.println(s);
        for (String s : b.getHosts()) System.out.println(s);
    }
}
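`fs.listFiles(path, true)` walks a directory tree recursively; `Files.walk` is the local-filesystem analogue. A small self-contained sketch (the directory and file names are invented for the demo):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.stream.Stream;

public class ListFilesSketch {
    public static void main(String[] args) throws IOException {
        // Build a tiny tree: <tmp>/sub/a.txt, then walk it recursively.
        Path root = Files.createTempDirectory("list-demo");
        Path sub = Files.createDirectory(root.resolve("sub"));
        Files.createFile(sub.resolve("a.txt"));
        try (Stream<Path> s = Files.walk(root)) {
            s.filter(Files::isRegularFile)
             .forEach(p -> System.out.println(root.relativize(p)));
        }
    }
}
```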

Appending content to the end of a file. First, enable append support in HDFS by adding the following to hdfs-site.xml:
<property>
    <name>dfs.support.append</name>
    <value>true</value>
</property>
The code implementation is:
@Test
public void appendFile() {
    String hdfsPath = "hdfs://localhost:8020/user/root/input/test.txt"; // destination file path
    // conf.setBoolean("dfs.support.append", true);
    String inPath = "/test.txt1"; // local file whose content will be appended
    try {
        InputStream in = new BufferedInputStream(new FileInputStream(inPath));
        OutputStream out = fs.append(new Path(hdfsPath));
        IOUtils.copyBytes(in, out, 4096, true);
    } catch (IOException e) {
        e.printStackTrace();
    }
}
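The append-only semantics above mirror a local FileOutputStream opened in append mode. A dependency-free sketch of the same behavior (the class name and temp-file contents are arbitrary):

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.file.Files;

public class AppendSketch {
    public static void main(String[] args) throws IOException {
        // Like HDFS, this only adds bytes at the end; existing content is untouched.
        File f = File.createTempFile("append-demo", ".txt");
        try (FileOutputStream out = new FileOutputStream(f)) {
            out.write("line1\n".getBytes());
        }
        try (FileOutputStream out = new FileOutputStream(f, true)) { // true = append mode
            out.write("line2\n".getBytes());
        }
        System.out.print(new String(Files.readAllBytes(f.toPath())));
        f.delete();
    }
}
```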





