Using Eclipse on Windows 7 to Build a Hadoop Development Environment


Many online tutorials use Eclipse on Linux to develop Hadoop applications. However, most Java programmers are more familiar with Windows than with Linux and would rather develop Hadoop programs on Windows. This article summarizes how to use Eclipse on Windows to develop Hadoop program code.


1. Download the Hadoop Eclipse plug-in JAR package

The Hadoop version is 2.3.0 and the Hadoop cluster is built on CentOS 6.x. The plug-in package is:

FTP address: ftp://ftp1.bkjia.com

Username: ftp1.bkjia.com

Password: www.bkjia.com

FTP path: 2014 \ December \ Hadoop used Eclipse in Windows 7 to build a Hadoop development environment (via LinuxIDC.com)

For the download method, see

The JAR package is named hadoop-eclipse-plugin-2.3.0 and can be used with the Hadoop 2.x series.

2. Put the plug-in package in the eclipse/plugins directory.

For future convenience, I put as many JAR packages as possible here.

3. Restart Eclipse and configure the Hadoop installation directory

If the plug-in is installed successfully, open Window -> Preferences and a Hadoop Map/Reduce option appears on the left side of the window. Click this option and set the Hadoop installation path on the right side of the window.

4. Configure Map/Reduce Locations

Open Window -> Open Perspective -> Other

Select Map/Reduce and click OK. A Map/Reduce Locations tab appears in the lower right corner.

Click the Map/Reduce Locations tab, then click the elephant icon on the right to open the Hadoop Location configuration window:

Enter any Location Name. Configure the Map/Reduce Master and DFS Master: the Host and Port must be consistent with the core-site.xml settings.

Check the core-site.xml configuration; the relevant property is fs.default.name, which is set here to hdfs://name01:9000.
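For reference, this is how the property would appear inside core-site.xml; the host name01 and port 9000 are this cluster's values, so adjust them to match your own setup:

```xml
<!-- core-site.xml: default file system URI that clients (and Eclipse) connect to -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://name01:9000</value>
</property>
```

The DFS Master Host/Port in the Eclipse location dialog must match this value exactly, or the connection will fail.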

The interface configuration is as follows:

Click "Finish" to close the window. Then click DFS Locations -> myhadoop (the location name configured in the previous step) on the left. If the directory tree is displayed, the installation succeeded. Here, however, an error message is displayed instead: Error: Permission denied: user=root, access=READ_EXECUTE, inode="/tmp":hadoop:supergroup:drwx------

This is a permission issue: assign all the Hadoop-related folders under the /tmp/ directory to the hadoop user and grant 777 permissions.

cd /tmp/

chmod 777 /tmp/

chown -R hadoop.hadoop /tmp/hsperfdata_root
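Note that the inode="/tmp" in the error message refers to the /tmp directory on HDFS, not the local filesystem, so if the local chmod above does not clear the error, the HDFS directory's permissions may need to be relaxed as well. This is an inference from the error text rather than a step from the original article; run it as a user with HDFS superuser rights:

```
hadoop fs -chmod -R 777 /tmp
```

Opening the permissions this wide is only acceptable on a private development cluster.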

Then, reconnect and open DFS Locations.

Map/Reduce Master (this is the Map/Reduce address of the Hadoop cluster; it should be the same as the mapred.job.tracker setting in mapred-site.xml)
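For reference, the corresponding mapred-site.xml entry would look like the following. The article does not show this file, so the host follows the name01 used in core-site.xml and the port 9001 is only the conventional JobTracker default, not a value confirmed by the source:

```xml
<!-- mapred-site.xml: JobTracker address the Eclipse Map/Reduce Master must match -->
<property>
  <name>mapred.job.tracker</name>
  <value>name01:9001</value>
</property>
```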

(1) Click to report an error:

An internal error occurred during: "Connecting to DFS hadoopname01".

java.net.UnknownHostException: name01

Enter the IP address 192.168.52.128 directly in the Host field instead of the hostname, and the location opens normally.
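Alternatively (a common fix not taken from the original article), the UnknownHostException can be resolved while keeping the hostname by mapping name01 to the cluster's IP in the Windows hosts file at C:\Windows\System32\drivers\etc\hosts:

```
# Map the Hadoop master hostname to its IP (values from this article's cluster)
192.168.52.128    name01
```

This also keeps the Eclipse configuration working if other Hadoop settings reference the hostname rather than the IP.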

5. Create a WordCount Project

Choose File -> New -> Project, select Map/Reduce Project, and enter the project name WordCount.

Create a class named WordCount in the WordCount project. If the error "Invalid Hadoop Runtime specified; please click 'Configure Hadoop install directory' or fill in library location input field" appears, the wrong directory was selected: the Hadoop install directory must point at the installation root (E:\hadoop here), not a subdirectory under it.

Click Next through the remaining steps, then click Finish to complete the project creation. The following information appears in the Eclipse console:

14-12-9 04:03:10 P.M.: Eclipse is running in a JRE, but a JDK is required. Some Maven plugins may not work when importing projects or updating source folders.

14-12-9 04:03:13 P.M.: Refreshing [/WordCount/pom.xml]

14-12-9 04:03:14 P.M.: Refreshing [/WordCount/pom.xml]

14-12-9 04:03:14 P.M.: Refreshing [/WordCount/pom.xml]

14-12-9 04:03:14 P.M.: Updating index central | http://repo1.maven.org/maven2

14-12-9 04:04:10 P.M.: Updated index for central | http://repo1.maven.org/maven2
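The WordCount source itself is not shown in this excerpt. As a sanity check of the logic the MapReduce job implements (tokenize each line, count occurrences per word), here is a dependency-free Java sketch; it is not the Hadoop Mapper/Reducer code, just the equivalent computation on a small in-memory input:

```java
import java.util.Map;
import java.util.TreeMap;

public class WordCountSketch {
    // Count word occurrences across input lines; this is the result the
    // Hadoop WordCount job produces for the same input, computed locally.
    static Map<String, Integer> count(String[] lines) {
        Map<String, Integer> counts = new TreeMap<>();
        for (String line : lines) {
            // The Hadoop example tokenizes on whitespace; split does the same here.
            for (String word : line.trim().split("\\s+")) {
                if (!word.isEmpty()) {
                    counts.merge(word, 1, Integer::sum);
                }
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        Map<String, Integer> c = count(new String[] {"hello hadoop", "hello eclipse"});
        System.out.println(c); // prints {eclipse=1, hadoop=1, hello=2}
    }
}
```

In the real job the per-word "1" values are emitted by the Mapper and summed by the Reducer across the cluster; the merge call above plays both roles.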

For more details, please continue to read the highlights on the next page:

