Importing the Hadoop Source Project into Eclipse and Writing Hadoop Programs


I. Importing the Hadoop Source Project into Eclipse

Basic steps:

1) Create a new Java project named "hadoop-1.2.1" in Eclipse.

2) Copy the core, hdfs, mapred, tools, and examples directories from the src directory of the extracted Hadoop archive into the src directory of the new project.

3) Right-click the project and choose Build Path > Configure Build Path. On the Source tab, remove src and add src/core, src/hdfs, src/mapred, and src/tools as source directories.

4) Create a lib directory in the project for the dependency jars. Import all the jar packages from the extracted Hadoop directory (excluding the two documentation files), not forgetting the jars in its jsp-2.1 subdirectory, and import all the jars under Ant's lib directory as well.

5) Select all the jars in the lib directory, right-click, and choose Build Path > Add to Build Path.

6) Right-click the project and choose Build Project to recompile it and surface any errors.

7) Import the Ant jars: copy all the jars under Ant's lib into the project's lib directory, then right-click the project and choose Build Path > Libraries > Add JARs > hadoop-1.2.1 > lib. The newly added jars will appear; click OK to finish.

8) If the project still fails to compile, right-click the project, choose Build Path > Libraries, expand JRE System Library, click "Access rules: No rules defined", and select Edit.

In the edit dialog, set Resolution to Accessible, enter **/* as the rule pattern, and click OK.

This completes the import.

II. Writing Hadoop Programs in Eclipse

Basic steps:

1) Create a new Hadoop project and a lib directory inside it, then copy the required dependency jars from the lib directory of the extracted Hadoop archive into it.

Note that hadoop-core-1.2.1.jar itself sits in the top-level hadoop-1.2.1 directory rather than in lib. Add all of these jars to the build path.

2) Add Hadoop's configuration files.

Right-click the project and create a new folder named conf, then place two configuration files in it: core-site.xml and hdfs-site.xml.
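As a sketch only, a minimal core-site.xml might look like the following. The namenode address hdfs://linux.chaofn.org:9000 is the hypothetical one used in the example code later in this article; substitute your own cluster's hostname and port.

```xml
<?xml version="1.0"?>
<!-- core-site.xml: tells HDFS clients where the namenode is.
     The hostname and port below are placeholders. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://linux.chaofn.org:9000</value>
  </property>
</configuration>
```

hdfs-site.xml can stay empty (just a bare <configuration/> element) unless you need to override HDFS defaults such as the replication factor.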

3) Add the JUnit 4 unit-testing library.

Right-click the project and choose Build Path > Add Libraries > JUnit > JUnit 4 > OK.

With this in place, we can create packages and test classes under the project's src directory that create, read, update, and delete files on HDFS.

Example: read a file on HDFS and print its contents to the console with the following code:

package org.chaofn.hadoop.hdfs;

import java.io.InputStream;
import java.net.URL;

import org.apache.hadoop.fs.FsUrlStreamHandlerFactory;
import org.apache.hadoop.io.IOUtils;
import org.junit.Test;

public class HdfsUrlTest {

    // Let the Java URL class recognize the hdfs:// scheme
    static {
        URL.setURLStreamHandlerFactory(new FsUrlStreamHandlerFactory());
    }

    // View file contents
    @Test
    public void testRead() throws Exception {
        InputStream in = null;
        // file path
        String fileUrl = "hdfs://linux.chaofn.org:9000/wc/input/core-site.xml";
        try {
            in = new URL(fileUrl).openStream();
            // read the file contents and print them to the console
            IOUtils.copyBytes(in, System.out, 4096, false);
        } finally {
            IOUtils.closeStream(in);
        }
    }
}
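The URL approach above is read-only. For the create and delete operations mentioned earlier, the usual route is Hadoop's FileSystem API. The following is a sketch, not a definitive implementation: it assumes the same hypothetical namenode address (linux.chaofn.org:9000) and an illustrative scratch path /wc/tmp/hello.txt, and it needs a running HDFS cluster to actually pass.

```java
package org.chaofn.hadoop.hdfs;

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.junit.Test;

public class HdfsFileSystemTest {

    @Test
    public void testWriteAndDelete() throws Exception {
        // Configuration also picks up core-site.xml/hdfs-site.xml from the classpath;
        // the URI below is the hypothetical cluster address used throughout this article.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(
                new URI("hdfs://linux.chaofn.org:9000"), conf);

        // Illustrative scratch path; any writable HDFS path works.
        Path file = new Path("/wc/tmp/hello.txt");

        // Create the file (overwriting if it exists) and write some bytes.
        FSDataOutputStream out = fs.create(file, true);
        try {
            out.writeUTF("hello hdfs");
        } finally {
            out.close();
        }

        // Verify it exists, then delete it (second argument: recursive).
        System.out.println("exists: " + fs.exists(file));
        fs.delete(file, false);
        fs.close();
    }
}
```

Overwriting a file with fs.create(path, true) is also how HDFS "updates" work in practice, since HDFS files are write-once rather than editable in place.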
