Hadoop 2.6 development environment under Windows


First, prepare the plug-in

1. Compile it yourself

1.1 Installing Ant

    • Download Ant from the official website: apache-ant-1.9.6-bin.zip
    • Configure the environment variables: create a new ANT_HOME variable with the value E:\apache-ant-1.9.6, then append ";%ANT_HOME%\bin" to Path (a command-line sketch follows this list)
    • Test the installation with ant -version
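If you prefer the command line, the same variables can be set from a Command Prompt. This is only a minimal sketch using the install path from above; setx changes apply to newly opened consoles only:

    setx ANT_HOME "E:\apache-ant-1.9.6"
    rem caution: setx rewrites the *user* Path and truncates long values;
    rem the Environment Variables dialog described above is equally fine
    setx PATH "%PATH%;E:\apache-ant-1.9.6\bin"
    rem open a new Command Prompt, then verify:
    ant -version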

1.2 Download Hadoop 2.6.0

Go to http://hadoop.apache.org/releases.html#News and choose the 2.6.0 binary release.

1.3 Download the hadoop2x-eclipse-plugin source code

Address: https://github.com/winghc/hadoop2x-eclipse-plugin; click "Download ZIP" on the right to download it.

1.4 Compiling

    • Unzip hadoop2x-eclipse-plugin-master.zip, for example onto the E: drive, then enter E:\hadoop2x-eclipse-plugin-master\src\contrib\eclipse-plugin
    • Run Ant: ant jar -Dversion=2.6.0 -Dhadoop.version=2.6.0 -Declipse.home="E:\Program Files (x86)\eclipse" -Dhadoop.home=E:\hadoop-2.6.0, where eclipse.home is the Eclipse installation directory and hadoop.home is the root directory of Hadoop 2.6.0 (the full command is written out after this list)
    • The built plugin ends up under the E:\hadoop2x-eclipse-plugin-master\build\contrib\eclipse-plugin directory
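For reference, the full build command with the paths used in this example (adjust eclipse.home and hadoop.home to your own directories, and quote any path that contains spaces):

    cd E:\hadoop2x-eclipse-plugin-master\src\contrib\eclipse-plugin
    ant jar -Dversion=2.6.0 -Dhadoop.version=2.6.0 ^
        -Declipse.home="E:\Program Files (x86)\eclipse" ^
        -Dhadoop.home=E:\hadoop-2.6.0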

Note: On my machine a commons-httpclient package failed to download and I never found the reason, so in the end I did not use the compile-it-yourself approach.

2. Direct download. This is the copy I downloaded myself, and it is confirmed to work:

Download link: http://pan.baidu.com/s/1dDwemop (extraction password: idve)

Second, Eclipse configuration

1. Copy the plugin jar into Eclipse's plugins directory and start Eclipse; you can then see DFS Locations on the left.
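As a sketch, assuming the jar built above is named hadoop-eclipse-plugin-2.6.0.jar and Eclipse is installed under E:\Program Files (x86)\eclipse, the copy can be done from a Command Prompt (close Eclipse first):

    copy E:\hadoop2x-eclipse-plugin-master\build\contrib\eclipse-plugin\hadoop-eclipse-plugin-2.6.0.jar "E:\Program Files (x86)\eclipse\plugins"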

2. Open Window > Preferences; you will see a Hadoop Map/Reduce entry. Select it and set the Hadoop root directory (the unpacked hadoop-2.6.0 directory).


3. Configuring MapReduce

3.1 Click Window > Show View > MapReduce Tools to open the Map/Reduce Locations view.

3.2 Go to the Map/Reduce Locations tab and click the blue icon on the right to open the configuration window. Enter any location name and configure Map/Reduce Master and DFS Master.


Note: ports 9001 and 9000 must be open, and the Map/Reduce Master and DFS Master settings must be consistent with configuration files such as mapred-site.xml and core-site.xml.
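As an illustration only, with a placeholder host name master and the 9000/9001 ports mentioned above, the matching server-side entries might look like the following (the exact property names depend on your cluster setup):

    <!-- core-site.xml: must match the DFS Master host and port -->
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://master:9000</value>
    </property>

    <!-- mapred-site.xml: classic JobTracker-style address matching the Map/Reduce Master setting (assumed) -->
    <property>
        <name>mapred.job.tracker</name>
        <value>master:9001</value>
    </property>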

4. Test if the connection is successful

4.1 Log on to the Hadoop server and start the Hadoop services.
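On the server, a typical start sequence for a Hadoop 2.6 installation looks like this (assuming the standard sbin layout; jps simply confirms the daemons are running):

    # run from the Hadoop installation directory on the server
    sbin/start-dfs.sh
    sbin/start-yarn.sh
    jps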

4.2 Check DFS Locations on the left; if you can see the files you uploaded, the connection is working.
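Beyond the DFS Locations view, a small Java program run from an Eclipse project can also confirm connectivity. This is only a sketch; hdfs://master:9000 is a placeholder that must match your DFS Master setting:

    // HdfsConnectionTest.java - list the HDFS root directory to verify the connection
    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsConnectionTest {
        public static void main(String[] args) throws Exception {
            // point at the NameNode configured as DFS Master (placeholder host and port)
            FileSystem fs = FileSystem.get(URI.create("hdfs://master:9000"), new Configuration());
            for (FileStatus status : fs.listStatus(new Path("/"))) {
                System.out.println(status.getPath());
            }
            fs.close();
        }
    }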


Note: When uploading a file to HDFS from the left-hand pane, you may be told that you have no write permission. There are three ways to resolve this:

A. Open hdfs-site.xml and add the following code (not suitable for a production environment):

<property>
    <name>dfs.permissions</name>
    <value>false</value>
</property>

B. Modify the read/write permissions of the file or directory, for example: bin/hadoop dfs -chmod 777 /in/input3

C. On Windows, create a user with the same name as the Hadoop user and start Eclipse as that user. (I have not tested this myself; try it if you are interested.)
