Writing a Java Client for HDFS

HDFS shell operations are simple: you can look them up in the documentation directly, and they are similar to Linux commands. Below is a brief summary of how to write an HDFS Java client.

Set up the project, with the client code placed under an hdfs package:

You first need to import the dependency jars, which can be found under the share folder of the Hadoop distribution. For an HDFS client, these are the jars under share/hadoop/common and share/hadoop/hdfs, together with their lib subdirectories.
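If you use Maven instead of copying jars by hand, the hadoop-client artifact should pull in the same libraries. A minimal sketch, assuming the version matches the hadoop-2.4.1 install used in this post:

<!-- Assumed Maven coordinates; adjust the version to your own Hadoop install -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.4.1</version>
</dependency>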

The rest is writing the client code. There are not many problems when writing under Linux, but under Windows you can hit various errors, as described below.

First, look at the core-site.xml file:

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License. See accompanying LICENSE file.
-->
<!-- Put site-specific property overrides in this file. -->
<configuration>
    <!-- Specifies the default file system URI used by Hadoop,
         i.e. the address of the HDFS boss, the NameNode -->
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://192.168.230.134:9000</value>
    </property>
    <!-- Specifies the storage directory for files produced at Hadoop runtime -->
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/home/hadoop/app/hadoop-2.4.1/data</value>
    </property>
</configuration>

Then take a look at the hdfs-site.xml file:

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License. See accompanying LICENSE file.
-->
<!-- Put site-specific property overrides in this file. -->
<configuration>
    <!-- Number of block replicas -->
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>

Next, write a utility class, HdfsUtils.java:

package cn.darrenchan.hadoop.hdfs;

import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.net.URI;

import org.apache.commons.io.IOUtils;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.LocatedFileStatus;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.RemoteIterator;
import org.junit.Before;
import org.junit.Test;

/**
 * If the current user is not hadoop, add -DHADOOP_USER_NAME=hadoop under
 * Run As -> Run Configurations -> Arguments, otherwise an error is thrown.
 * Another, more thorough solution is to specify the user in the
 * FileSystem.get() call in the init() method.
 *
 * @author chenchi
 */
public class HdfsUtils {

    private FileSystem fs;

    /**
     * Methods annotated with @Before are executed before each test method.
     */
    @Before
    public void init() throws Exception {
        // Reads the xxx-site.xml configuration files on the classpath and
        // parses their contents into a Configuration object
        Configuration conf = new Configuration();
        // If the configuration files are placed under src, this line is unnecessary.
        // You can also set configuration values manually in code, which
        // overrides the values read from the configuration files.
        conf.set("fs.defaultFS", "hdfs://192.168.230.134:9000");
        // This line is only needed when running under Windows
        System.setProperty("hadoop.home.dir", "E:\\Big Data Tutorial Hadoop 8 Days\\hadoop-2.4.1");
        // Obtain a client instance object for the specific file system,
        // based on the configuration information
        fs = FileSystem.get(new URI("hdfs://weekend110:9000"), conf, "hadoop");
    }

    /**
     * Upload a file (low-level version).
     */
    @Test
    public void upload() throws IOException {
        Path dst = new Path("hdfs://192.168.230.134:9000/aa/qingshu.txt");
        FSDataOutputStream out = fs.create(dst);
        FileInputStream in = new FileInputStream("E:/qingshu.txt");
        IOUtils.copy(in, out);
    }

    /**
     * Upload a file (simple version).
     */
    @Test
    public void upload2() throws IOException {
        fs.copyFromLocalFile(new Path("E:/qingshu.txt"),
                new Path("hdfs://weekend110:9000/aaa/bbb/ccc/qingshu100.txt"));
    }

    /**
     * Download a file.
     */
    @Test
    public void download() throws IllegalArgumentException, IOException {
        // The form below throws a NullPointerException on Windows, so use the
        // four-argument overload instead:
        // fs.copyToLocalFile(new Path("hdfs://weekend110:9000/aa/qingshu.txt"),
        //         new Path("E:/chen.txt"));
        fs.copyToLocalFile(false, new Path("hdfs://weekend110:9000/aa/qingshu.txt"),
                new Path("E:/chen.txt"), true);
    }

    /**
     * View file information.
     */
    @Test
    public void listFiles() throws FileNotFoundException, IllegalArgumentException, IOException {
        // listFiles lists file information and provides recursive traversal
        RemoteIterator<LocatedFileStatus> files = fs.listFiles(new Path("/"), true);
        while (files.hasNext()) {
            LocatedFileStatus file = files.next();
            Path filePath = file.getPath();
            String fileName = filePath.getName();
            System.out.println(fileName);
        }
        System.out.println("---------------------------------");
        // listStatus lists information about both files and folders,
        // but does not provide recursive traversal
        FileStatus[] listStatus = fs.listStatus(new Path("/"));
        for (FileStatus status : listStatus) {
            String name = status.getPath().getName();
            System.out.println(name + (status.isDirectory() ? " directory" : " file"));
        }
    }

    /**
     * Create a folder.
     */
    @Test
    public void mkdir() throws IllegalArgumentException, IOException {
        // The path can also be abbreviated as /aaa/bbb/ccc
        fs.mkdirs(new Path("hdfs://weekend110:9000/aaa/bbb/ccc"));
    }

    /**
     * Delete a file or folder.
     */
    @Test
    public void rm() throws IllegalArgumentException, IOException {
        // Recursive delete; the path name can be abbreviated
        fs.delete(new Path("/aa"), true);
        // fs.rename(path, path1); // this moves a file to a different folder
    }
}

Here is a summary of the two errors encountered:

1. If the user running the client is not hadoop, you will get a permission error. Adding -DHADOOP_USER_NAME=hadoop under Run As -> Run Configurations -> Arguments fixes it.

Another, more thorough solution is to specify the user directly in the FileSystem.get() call in the init() method, like this:
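These are the relevant lines from init() above; the third argument of FileSystem.get() is the user name under which all requests to the NameNode are made:

Configuration conf = new Configuration();
// The three-argument overload takes the remote user name directly,
// so -DHADOOP_USER_NAME is no longer needed
FileSystem fs = FileSystem.get(new URI("hdfs://weekend110:9000"), conf, "hadoop");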

2. When running under Windows, you may get the error: Could not locate executable null\bin\winutils.exe in the Hadoop binaries ...

Obviously, this is a HADOOP_HOME problem. If HADOOP_HOME is empty, fullExeName necessarily resolves to null\bin\winutils.exe. The workaround is simple: configure the environment variable, or, if you don't want to restart the computer, set it in the program:
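This is the same line used in the init() method above:

// Must point at the root of a local Hadoop unpack whose bin folder
// contains winutils.exe (see below)
System.setProperty("hadoop.home.dir", "E:\\Big Data Tutorial Hadoop 8 Days\\hadoop-2.4.1");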

Note: E:\\Big Data Tutorial Hadoop 8 Days\\hadoop-2.4.1 is the path where I unzipped Hadoop locally.

You may still get the same error after this, because if you look in your hadoop-x.x.x/bin directory you will find that there is no winutils.exe there at all.

You can download one from GitHub; here is a well-known address:

Address: https://github.com/srccodes/hadoop-common-2.2.0-bin

After downloading, copy winutils.exe into your hadoop-x.x.x/bin directory.
