Configure a Hadoop application development environment with Eclipse on Ubuntu

Hello everyone, today I will introduce how to configure a Hadoop application development environment with Eclipse on Ubuntu. The purpose is simple: for research and learning, deploy a Hadoop runtime environment and build a Hadoop development and testing environment.

Environment: Ubuntu 12.04

Step 1: Download eclipse-sdk-4.2.1-linux-gtk.tar.gz

http://mirrors.ustc.edu.cn/eclipse/eclipse/downloads/drops4/R-4.2.1-201209141800/eclipse-SDK-4.2.1-linux-gtk.tar.gz

Note: download the 32-bit Eclipse for Linux, not the 64-bit build; otherwise Eclipse will not start.

Step 2: Download the latest version of the Hadoop Eclipse plug-in

Download it from FTP server No. 1 of the Linux community:

FTP address: ftp://www.linuxidc.com

Username: www.linuxidc.com

Password: www.muu.cc

Directory: 2013 LinuxIDC.com \ January \ Ubuntu, Eclipse developed Hadoop application environment configuration

For the download method, see http://www.linuxidc.net/thread-1187-1-1.html

Copy hadoop-1.0.4-eclipse-plugin.jar into the eclipse/plugins directory and restart Eclipse. After the restart, check for the following:

1. A DFS Locations node appears in the Project Explorer view on the left.

2. Under Window -> Preferences, a Hadoop Map/Reduce entry has been added. Select it and, on the right, point it to the root directory of the Hadoop distribution you downloaded.

If you can see both of these, the plug-in was installed successfully.
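DFS Locations browses HDFS through the same client API that your own code will use. If you want to double-check connectivity outside the Eclipse view, a minimal sketch like the one below lists the HDFS root directory; the class name ListHdfsRoot and the address hdfs://localhost:9000 are my own assumptions, so substitute the fs.default.name of your cluster.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Hypothetical verification class: prints the contents of the HDFS root
// directory, i.e. the same listing that DFS Locations shows inside Eclipse.
public class ListHdfsRoot {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Assumed address -- replace with the fs.default.name of your cluster.
        conf.set("fs.default.name", "hdfs://localhost:9000");
        FileSystem fs = FileSystem.get(conf);
        for (FileStatus status : fs.listStatus(new Path("/"))) {
            System.out.println(status.getPath());
        }
        fs.close();
    }
}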

Step 3: Configure the Hadoop path

In Window -> Preferences, select "Hadoop Map/Reduce" and click "Browse..." to choose the path of the Hadoop installation folder.

This step has nothing to do with the runtime environment; it only means that all the jar packages under the Hadoop root directory and its lib directory are automatically imported when a Map/Reduce project is created.
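To confirm that the jars really are imported, create a Map/Reduce project (File -> New -> Project -> Map/Reduce Project) and add a class such as the classic WordCount example sketched below; it should compile without any manual build-path changes. This follows the standard Hadoop 1.x WordCount; the class name and the input/output paths (taken from the command line) are placeholders you can change.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Mapper: emits (word, 1) for every token in the input line.
    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, one);
            }
        }
    }

    // Reducer: sums the counts emitted for each word.
    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        private IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = new Job(conf, "word count"); // Hadoop 1.x constructor
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // input directory
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // output directory (must not exist yet)
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

You can then run it from Eclipse via the plug-in's Run on Hadoop option, or export the project as a jar and submit it with hadoop jar.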
