Writing a Java Client for HDFS


Note: all of the following code was written and run in Eclipse on Linux.

1. First, test downloading a file from HDFS.

Code to download the file (it copies hdfs://localhost:9000/jdk-7u65-linux-i586.tar.gz to the local file /opt/download/doload.tgz):

package cn.qlq.hdfs;

import java.io.FileOutputStream;
import java.io.IOException;

import org.apache.commons.compress.utils.IOUtils;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsUtil {
    public static void main(String[] args) throws IOException {
        // download a file from HDFS to the local file system
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        Path path = new Path("hdfs://localhost:9000/jdk-7u65-linux-i586.tar.gz");
        FSDataInputStream input = fs.open(path);
        FileOutputStream output = new FileOutputStream("/opt/download/doload.tgz");
        IOUtils.copy(input, output);
    }
}

Running it directly fails with an error.

The reason is that the program cannot resolve a path like hdfs://localhost:9000/jdk-7u65-linux-i586.tar.gz: with an empty Configuration, fs.defaultFS falls back to its built-in default file:///, so FileSystem.get(conf) returns the local file system, which rejects the hdfs:// path:

log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" java.lang.IllegalArgumentException: Wrong FS: hdfs://localhost:9000/jdk-7u65-linux-i586.tar.gz, expected: file:///
    at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:643)
    at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:79)
    at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:506)
    at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:724)
    at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:501)
    at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:397)
    at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.<init>(ChecksumFileSystem.java:137)
    at org.apache.hadoop.fs.ChecksumFileSystem.open(ChecksumFileSystem.java:339)
    at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:764)
    at cn.qlq.hdfs.HdfsUtil.main(HdfsUtil.java:21)
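A quick way to see the cause (a minimal sketch of mine, not part of the original post; the class name FsUriCheck is made up): with no *-site.xml on the classpath, an empty Configuration resolves to the local file system, and its checkPath() then rejects hdfs:// paths.

package cn.qlq.hdfs;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class FsUriCheck {
    public static void main(String[] args) throws IOException {
        // With no configuration files on the classpath, fs.defaultFS keeps its
        // built-in default "file:///", so this returns the local file system.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        System.out.println(fs.getUri()); // prints file:/// in this situation
    }
}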

There are two workarounds:

    • The first: copy core-site.xml from the etc/hadoop directory of the Hadoop installation into the src directory of the Eclipse project, so it ends up on the classpath and the error goes away (a quick check of this is sketched after the result below).

Result of running it again:

(the download succeeds; doload.tgz, 143588167 bytes in the listing, appears under /opt/download)
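To confirm that the copied core-site.xml is really being read from the classpath, a small check can be added (my sketch, not from the original post; the class name ConfCheck is made up):

package cn.qlq.hdfs;

import org.apache.hadoop.conf.Configuration;

public class ConfCheck {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        // With core-site.xml on the classpath this should print
        // hdfs://localhost:9000; without it, the built-in default file:///
        System.out.println(conf.get("fs.defaultFS"));
    }
}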

    • The second: set the property directly in the program.

Let's start by looking at what is in core-site.xml:

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/opt/hadoop/hadoop-2.4.1/data/</value>
    </property>
</configuration>

Change the code to:

public static void main(String[] args) throws IOException {
    // download a file from HDFS to the local file system
    Configuration conf = new Configuration();
    // set the HDFS address so that hdfs:// paths are recognized
    conf.set("fs.defaultFS", "hdfs://localhost:9000");
    FileSystem fs = FileSystem.get(conf);

    Path path = new Path("hdfs://localhost:9000/jdk-7u65-linux-i586.tar.gz");
    FSDataInputStream input = fs.open(path);
    FileOutputStream output = new FileOutputStream("/opt/download/doload.tgz");
    IOUtils.copy(input, output);
}
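The streams in the examples above are never closed. A variant of the download (a sketch of mine, not from the original post; the class name HdfsDownload is made up) closes them with try-with-resources:

package cn.qlq.hdfs;

import java.io.FileOutputStream;
import java.io.IOException;

import org.apache.commons.compress.utils.IOUtils;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsDownload {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://localhost:9000");
        FileSystem fs = FileSystem.get(conf);

        // try-with-resources closes both streams even if the copy fails
        try (FSDataInputStream input = fs.open(new Path("hdfs://localhost:9000/jdk-7u65-linux-i586.tar.gz"));
             FileOutputStream output = new FileOutputStream("/opt/download/doload.tgz")) {
            IOUtils.copy(input, output);
        } finally {
            fs.close();
        }
    }
}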

2. The following code demonstrates the basic operations of HDFS:
package cn.qlq.hdfs;

import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

import org.apache.commons.compress.utils.IOUtils;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.LocatedFileStatus;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.RemoteIterator;
import org.junit.Before;
import org.junit.Test;

public class HdfsUtil {

    private FileSystem fs = null;

    @Before
    public void before() throws IOException, InterruptedException, URISyntaxException {
        // reads the xxx-site.xml configuration files on the classpath and parses them into a Configuration object
        Configuration conf = new Configuration();
        // configuration can also be set manually in code, overriding values read from the configuration files
        conf.set("fs.defaultFS", "hdfs://localhost:9000/");
        // obtain a client instance for the specific file system, based on the configuration, acting as user "root"
        fs = FileSystem.get(new URI("hdfs://localhost:9000/"), conf, "root");
    }

    /**
     * Upload a file, the lower-level way
     *
     * @throws Exception
     */
    @Test
    public void upload() throws Exception {
        Path dst = new Path("hdfs://localhost:9000/aa/qingshu.txt");
        FSDataOutputStream os = fs.create(dst);
        FileInputStream is = new FileInputStream("/opt/download/haha.txt");
        IOUtils.copy(is, os);
    }

    /**
     * Upload a file with the packaged convenience method
     *
     * @throws Exception
     */
    @Test
    public void upload2() throws Exception {
        fs.copyFromLocalFile(new Path("/opt/download/haha.txt"),
                new Path("hdfs://localhost:9000/aa/qingshu2.txt"));
    }

    /**
     * Download a file
     *
     * @throws IOException
     */
    @Test
    public void download() throws IOException {
        Path path = new Path("hdfs://localhost:9000/jdk-7u65-linux-i586.tar.gz");
        FSDataInputStream input = fs.open(path);
        FileOutputStream output = new FileOutputStream("/opt/download/doload.tgz");
        IOUtils.copy(input, output);
    }

    /**
     * Download a file with the packaged convenience method
     *
     * @throws Exception
     */
    @Test
    public void download2() throws Exception {
        fs.copyToLocalFile(new Path("hdfs://localhost:9000/aa/qingshu2.txt"),
                new Path("/opt/download/haha2.txt"));
    }

    /**
     * List file information
     *
     * @throws IOException
     * @throws IllegalArgumentException
     * @throws FileNotFoundException
     */
    @Test
    public void listFiles() throws FileNotFoundException, IllegalArgumentException, IOException {
        // listFiles lists file information only, and recurses into subdirectories
        RemoteIterator<LocatedFileStatus> files = fs.listFiles(new Path("/"), true);
        while (files.hasNext()) {
            LocatedFileStatus file = files.next();
            Path filePath = file.getPath();
            String fileName = filePath.getName();
            System.out.println(fileName);
        }

        System.out.println("---------------------------------");

        // listStatus lists both files and directories, but does not recurse
        FileStatus[] listStatus = fs.listStatus(new Path("/"));
        for (FileStatus status : listStatus) {
            String name = status.getPath().getName();
            System.out.println(name + (status.isDirectory() ? " is dir" : " is file"));
        }
    }

    /**
     * Create a directory
     *
     * @throws Exception
     */
    @Test
    public void mkdir() throws IllegalArgumentException, Exception {
        fs.mkdirs(new Path("/aaa/bbb/ccc"));
    }

    /**
     * Delete a file or directory
     *
     * @throws IOException
     */
    @Test
    public void rm() throws IllegalArgumentException, IOException {
        fs.delete(new Path("/aa"), true);
    }
}
