Operating a Hadoop cluster through the Java interface



Start from a Hadoop cluster that is already configured and running.

The following is what I implemented in a test class of a project built on the SSM (Spring + Spring MVC + MyBatis) framework.

First, configure the environment variables under Windows. Download the file below and unzip it to the C drive or another directory.

Link: http://pan.baidu.com/s/1jHHPElg Password: AUFD

Configuring the environment variables

1. Configure HADOOP_HOME

Set HADOOP_HOME to the directory where you unzipped the download.

2. Configure Path

Add the following to Path:

%HADOOP_HOME%\bin
3. Configure HADOOP_USER_NAME

This is the user name for the Hadoop cluster, for example:

HADOOP_USER_NAME = root
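If you cannot, or would rather not, set a machine-wide environment variable, the same user name can be supplied from code before the first FileSystem call. A minimal sketch, assuming the root user from step 3 (Hadoop's UserGroupInformation honors HADOOP_USER_NAME set as a JVM system property as well as from the environment):

    // Must run before FileSystem.get() is first called in this JVM;
    // equivalent to setting the HADOOP_USER_NAME environment variable
    System.setProperty("HADOOP_USER_NAME", "root");

This only affects the current JVM, which is convenient for tests.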
Second, let Maven handle the dependency JARs

Add the Hadoop client and commons-io dependencies to pom.xml:

    <!-- Hadoop dependencies -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>2.7.4</version>
    </dependency>
    <dependency>
        <groupId>commons-io</groupId>
        <artifactId>commons-io</artifactId>
        <version>2.4</version>
    </dependency>
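With the dependencies in place, it is worth confirming that the JARs resolve and the cluster is reachable before wiring anything into Spring. A minimal sketch with a plain main() method (the class name HdfsSmokeTest is made up for illustration; the NameNode URL is the one used by the test class below, so substitute your own):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsSmokeTest {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Same NameNode address as in the test class below
            conf.set("fs.defaultFS", "hdfs://10.110.13.243:9000");
            FileSystem fs = FileSystem.get(conf);
            // Printing the root directory's status proves the connection works
            System.out.println(fs.getFileStatus(new Path("/")));
            fs.close();
        }
    }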
Third, create the test class
package com.mavenssmlr.hadoop;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;

/**
 * Java interface operations on Hadoop
 * 1. Configure the environment variables: HADOOP_HOME, HADOOP_USER_NAME
 * Created by Shirukai on 2017/11/2.
 */
@RunWith(SpringJUnit4ClassRunner.class)
// Tell JUnit where the Spring configuration file is
@ContextConfiguration({"classpath:spring/spring-dao.xml"})
public class TestHadoop {
    private Logger logger = LoggerFactory.getLogger(this.getClass());

    /**
     * Connect to Hadoop
     */
    public FileSystem connectHadoop() {
        String nameNodeUrl = "hdfs://10.110.13.243:9000";
        String nameNodeName = "fs.defaultFS";
        FileSystem fs = null;
        Configuration configuration = new Configuration();
        try {
            configuration.set(nameNodeName, nameNodeUrl);
            fs = FileSystem.get(configuration);
            logger.info("Connection succeeded: path={}", fs.getFileStatus(new Path("/")));
        } catch (Exception e) {
            logger.error(e.getMessage(), e);
        }
        return fs;
    }

    /**
     * Create a directory
     *
     * @throws Exception exception
     */
    @Test
    public void mkdirFolder() throws Exception {
        FileSystem fs = connectHadoop();
        String folderName = "/input";
        fs.mkdirs(new Path(folderName));
    }

    /**
     * Upload a file to Hadoop
     *
     * @throws Exception exception
     */
    @Test
    public void uploadFile() throws Exception {
        FileSystem fs = connectHadoop();
        // Local directory of the file to upload
        String localFilePath = "d://hadoop//upload//";
        // Name of the file to upload
        String fileName = "user.xlsx";
        // Target folder on HDFS
        String uploadFolder = "/input/";
        InputStream in = new FileInputStream(localFilePath + fileName);
        OutputStream out = fs.create(new Path(uploadFolder + fileName));
        IOUtils.copyBytes(in, out, 4096, true);
    }

    /**
     * Get a file from Hadoop
     *
     * @throws Exception exception
     */
    @Test
    public void getFileFromHadoop() throws Exception {
        FileSystem fs = connectHadoop();
        // HDFS directory to download from
        String downloadPath = "/input/";
        // Name of the file to download
        String downloadFileName = "user.xlsx";
        // Local path to save to
        String savePath = "d://hadoop//download//" + downloadFileName;
        InputStream in = fs.open(new Path(downloadPath + downloadFileName));
        OutputStream out = new FileOutputStream(savePath);
        IOUtils.copyBytes(in, out, 4096, true);
    }

    /**
     * Delete a file
     * delete(Path, boolean): if the boolean is true the deletion is recursive,
     * so subdirectories and their files are deleted too; false deletes only
     * the current path.
     *
     * @throws Exception exception
     */
    @Test
    public void deleteFile() throws Exception {
        FileSystem fs = connectHadoop();
        // Path of the file to delete
        String deleteFilePath = "/input/user.xlsx";
        boolean deleteResult = fs.delete(new Path(deleteFilePath), true);
        logger.info("Delete file: ={}", deleteResult);
    }

    /**
     * List all files in the specified directory
     *
     * @throws Exception exception
     */
    @Test
    public void getAllFile() throws Exception {
        FileSystem fs = connectHadoop();
        // Directory to list
        String getPath = "/";
        FileStatus[] statuses = fs.listStatus(new Path(getPath));
        for (FileStatus file : statuses) {
            logger.info("fileName={}", file.getPath().getName());
        }
    }

    @Test
    public void otherOption() throws Exception {
        FileSystem fs = connectHadoop();
    }
}
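The otherOption() test above is left empty in the original class. As one possible body for it, here is a sketch of two more everyday FileSystem calls, exists() and rename(), reusing the /input/user.xlsx path from the upload test; the target name user-renamed.xlsx is made up for illustration:

    /**
     * Other operations: existence check and rename
     * (rename also acts as "move" within HDFS)
     */
    @Test
    public void otherOption() throws Exception {
        FileSystem fs = connectHadoop();
        Path src = new Path("/input/user.xlsx");
        // Only rename if the file is actually there
        if (fs.exists(src)) {
            boolean renamed = fs.rename(src, new Path("/input/user-renamed.xlsx"));
            logger.info("Rename result={}", renamed);
        }
    }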

