Hadoop fully distributed Eclipse development environment configuration


Install Eclipse:

For details, see the tutorial: http://blog.csdn.net/wang_zhenwei/article/details/48032001

Install hadoop-eclipse-plugin:

Download hadoop2x-eclipse-plugin and copy hadoop-eclipse-kepler-plugin-2.2.0.jar from its release folder into the plugins folder of the Eclipse installation directory (although it is labeled 2.2.0, there is no problem using it on 2.6.0; it should work with any 2.x version). Then run eclipse -clean to restart Eclipse.

Download address: https://github.com/winghc/hadoop2x-eclipse-plugin

Note: Copy the hadoop-eclipse-plugin archive to the /home/hadoop/ directory.

(This step can be done as an ordinary user; if you are the root user, enter the /home/hadoop/ directory first.)

unzip ./hadoop2x-eclipse-plugin-master.zip

Extract

(Switch to the root user)

cp /home/hadoop/hadoop2x-eclipse-plugin-master/release/hadoop-eclipse-kepler-plugin-2.2.0.jar /usr/local/eclipse/eclipse/plugins/

Copy

/usr/local/eclipse/eclipse/eclipse -clean

Restart
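The install steps above can be sketched as a single shell session. The paths are the ones used in this article and are assumptions; adjust them if your layout differs.

```shell
# Paths from the article; adjust to your own installation (assumptions).
PLUGIN_JAR=/home/hadoop/hadoop2x-eclipse-plugin-master/release/hadoop-eclipse-kepler-plugin-2.2.0.jar
ECLIPSE_HOME=/usr/local/eclipse/eclipse

if [ -f "$PLUGIN_JAR" ] && [ -d "$ECLIPSE_HOME/plugins" ]; then
    # Copy the plugin into Eclipse's plugins folder (run as root if needed).
    cp "$PLUGIN_JAR" "$ECLIPSE_HOME/plugins/"
    # Restart Eclipse with -clean so the plugin registry is rescanned.
    "$ECLIPSE_HOME/eclipse" -clean
else
    echo "Adjust PLUGIN_JAR / ECLIPSE_HOME to match your installation."
fi
```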

After you start Eclipse, you can see DFS Locations in the Project Explorer on the left (if you see the Welcome screen, click the x in the top left corner to close it).

(Note: you can choose which views to display via Window → Show View.)

Further configuration of the plugin:

Step One:

Select Preferences under the Window menu.

A dialog pops up with the Hadoop Map/Reduce option on its left side. Click this option, then select the Hadoop installation directory (e.g. /usr/local/hadoop; typing the path directly also works).

Step Two:

Switch to the Map/Reduce working directory: under the Window menu, select Open Perspective → Other, and in the dialog that pops up choose the Map/Reduce option to switch.

Step Three:

To establish a connection to the Hadoop cluster, click the Map/Reduce Locations panel in the lower right corner of the Eclipse window, right-click inside the panel, and select New Hadoop Location.


Notes on the field values:

Location Name: can be arbitrary; it simply identifies a "Map/Reduce Location".

Map/Reduce Master

Host: 192.168.154.156 (the IP address of the Master node)

Port: 9001

DFS Master

Use M/R Master host: tick this box (because our NameNode and JobTracker are on the same machine).

Port: 9000

User name: hadoop (the default is the Windows system administrator name; since we changed it earlier, here it becomes hadoop).

Then switch to the Advanced parameters tab, which contains the detailed configuration. Remember that it must be consistent with Hadoop's own configuration files (in /usr/local/hadoop/etc/hadoop). For example, if hadoop.tmp.dir was configured there, it must be modified here as well, e.g. to /usr/local/hadoop/tmp.
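For reference, a cluster core-site.xml consistent with the values above might look like the fragment below. The exact values (IP address, port, temp directory) are taken from this article and are illustrative; check your own /usr/local/hadoop/etc/hadoop/core-site.xml.

```xml
<!-- /usr/local/hadoop/etc/hadoop/core-site.xml (illustrative values) -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <!-- must match the DFS Master host and port configured in the plugin -->
    <value>hdfs://192.168.154.156:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <!-- must match the hadoop.tmp.dir entered under Advanced parameters -->
    <value>/usr/local/hadoop/tmp</value>
  </property>
</configuration>
```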

If Hadoop is not running at this point, the connection attempt will fail with an error.

Start Hadoop, and the results will match your expectations.
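A quick way to start the cluster and confirm the daemons are up before refreshing DFS Locations in Eclipse. The /usr/local/hadoop path follows the article and is an assumption.

```shell
HADOOP_HOME=/usr/local/hadoop   # path from the article (assumption)

if [ -x "$HADOOP_HOME/sbin/start-dfs.sh" ]; then
    "$HADOOP_HOME/sbin/start-dfs.sh"    # start NameNode / DataNodes
    "$HADOOP_HOME/sbin/start-yarn.sh"   # start ResourceManager / NodeManagers
    jps                                 # should list NameNode, DataNode, etc.
else
    echo "Set HADOOP_HOME to your Hadoop installation directory."
fi
```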

Copyright notice: This is the blogger's original article; please do not reproduce it without the blogger's permission.
