1. Import the Hadoop jar packages
Add the jar packages under the hadoop/share/common/ directory, the hadoop/share/common/lib/ directory, and the hadoop/hdfs/ directory to the Eclipse build path.
2. Write the code
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class HdfsDemo {

    static FileSystem fs = null;

    public static void main(String[] args) throws Exception {
        init();
        testUpload();
    }

    public static void init() throws Exception {
        fs = FileSystem.get(new URI("hdfs://192.168.1.7:9000"),
                new Configuration(), "hadoop");
    }

    /**
     * Copy a local file into the HDFS file system
     * @throws Exception
     * @throws IOException
     */
    public static void testUpload() throws Exception, IOException {
        OutputStream remote = fs.create(new Path("/uploadjdk"));
        FileInputStream local = new FileInputStream("C://jdk.rar");
        IOUtils.copyBytes(local, remote, 4096, true);
    }

    /**
     * Download a file from the HDFS file system
     * @throws Exception
     * @throws IOException
     */
    public static void testDownload() throws Exception, IOException {
        InputStream in = fs.open(new Path("/eclipse-sdk-4.3.1-linux-gtk-x86_64.tar.gz"));
        OutputStream output = new FileOutputStream("C://jdk2.rar");
        IOUtils.copyBytes(in, output, 4096, true);
    }
}
The testUpload method uploads the local file "C://jdk.rar" to the root directory of the HDFS file system and names it "uploadjdk".
The testDownload method downloads "eclipse-sdk-4.3.1-linux-gtk-x86_64.tar.gz" from the root directory of the HDFS file system to the local C drive and saves it as "jdk2.rar".
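For illustration, IOUtils.copyBytes(in, out, 4096, true) just copies the stream in fixed-size chunks and, because the last argument is true, closes both streams when it finishes. A plain-Java sketch of that behavior (my own helper, not the Hadoop source) looks like this:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class CopyBytesSketch {

    // Sketch of what IOUtils.copyBytes(in, out, bufferSize, close) does:
    // copy in bufferSize-byte chunks, then close both streams if close is true.
    public static void copyBytes(InputStream in, OutputStream out,
                                 int bufferSize, boolean close) throws IOException {
        try {
            byte[] buffer = new byte[bufferSize];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        } finally {
            if (close) {
                in.close();
                out.close();
            }
        }
    }

    public static void main(String[] args) throws IOException {
        byte[] data = "hello hdfs".getBytes();
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        copyBytes(new ByteArrayInputStream(data), sink, 4096, true);
        System.out.println(new String(sink.toByteArray()));
    }
}
```

This is why the upload and download methods above need no explicit close() calls: passing true as the fourth argument hands stream cleanup to copyBytes.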
It is worth noting that the address "hdfs://192.168.1.7:9000" is the one configured in the /usr/local/hadoop/etc/hadoop/core-site.xml file in the second article, "Ubuntu Hadoop 2.7.0 Pseudo-Distributed Installation". If it is configured as "hdfs://localhost:9000", it needs to be changed to the actual machine IP for normal access.
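For reference, the relevant core-site.xml fragment in a Hadoop 2.x pseudo-distributed setup typically looks like the sketch below; the IP here matches the address used in the code above, and you should substitute your own machine's IP:

```xml
<configuration>
  <property>
    <!-- NameNode address that clients connect to; use the machine's real
         IP rather than localhost if HDFS is accessed from another host. -->
    <name>fs.defaultFS</name>
    <value>hdfs://192.168.1.7:9000</value>
  </property>
</configuration>
```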
Hadoop Learning (IV): Operating HDFS with Java