Configuring Eclipse to Run Hadoop on Linux

Source: Internet
Author: User
Tags: gtk, hadoop, fs

Hadoop version: hadoop-0.20.2
Eclipse Version: eclipse-java-helios-sr2-linux-gtk.tar.gz

======================== Installing Eclipse ========================

1. First, download Eclipse. There is not much to say about this step.


2. Install Eclipse
(1) Extract eclipse-java-helios-sr2-linux-gtk.tar.gz into a directory. I extracted it to /home/wangxing/development, which produces an eclipse directory.

(2) Create a startup script named eclipse in the /usr/bin directory by running the following command:
sudo gedit /usr/bin/eclipse

Then add the following content to the file:
#!/bin/sh
export MOZILLA_FIVE_HOME="/usr/lib/mozilla/"
export ECLIPSE_HOME="/home/wangxing/development/eclipse"
$ECLIPSE_HOME/eclipse $*

(3) Make the script executable by running the following command:
sudo chmod +x /usr/bin/eclipse

3. Add an icon to the Applications menu:
sudo gedit /usr/share/applications/eclipse.desktop

Then add the following content to the file:
[Desktop Entry]
Encoding=UTF-8
Name=Eclipse Platform
Comment=Eclipse IDE
Exec=eclipse
Icon=/home/wangxing/development/eclipse/icon.xpm
Terminal=false
StartupNotify=true
Type=Application
Categories=Application;Development;

======================== Installing Hadoop ========================

For details on installing Hadoop in pseudo-distributed mode on Linux, see the Hadoop website.
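
The later steps assume this pseudo-distributed cluster is already running. As a rough sketch (standard Hadoop 0.20 commands, run from the Hadoop installation directory once conf/core-site.xml, conf/hdfs-site.xml and conf/mapred-site.xml have been filled in):
bin/hadoop namenode -format   # format HDFS once, before the first start
bin/start-all.sh              # start NameNode, DataNode, JobTracker and TaskTracker
jps                           # JDK tool; check that the daemons are running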

======================== Configuring Hadoop in Eclipse ========================
1. Install the Hadoop plugin in Eclipse
Copy hadoop-0.20.203.0-eclipse-plugin.jar from the contrib/eclipse-plugin directory of the Hadoop installation to the plugins directory of the Eclipse installation.
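
For example, from a terminal (a sketch only; the Hadoop path below is an assumption, so use wherever you unpacked Hadoop, together with the Eclipse directory from the installation section above):
cp /home/wangxing/development/hadoop-0.20.2/contrib/eclipse-plugin/hadoop-0.20.203.0-eclipse-plugin.jar \
   /home/wangxing/development/eclipse/plugins/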


2. Restart Eclipse and configure the Hadoop installation directory.
If the plugin was installed successfully, open Window --> Preferences and you will find a Hadoop Map/Reduce option, under which you need to configure the Hadoop installation directory. Close the dialog once the configuration is complete.

3. Configure Map/Reduce Locations
Open the Map/Reduce Locations view via Window --> Show View, then create a new Hadoop location: right-click in the Map/Reduce Locations view --> New Hadoop location. In the dialog that pops up, configure the Location name (for example, hadoop) as well as the Map/Reduce Master and the DFS Master. Their host and port are the address and port you configured in mapred-site.xml and core-site.xml, respectively.
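
If you are unsure which values those are, a quick check from the Hadoop installation directory (a sketch; the property names are the standard 0.20 ones, and the ports shown are only typical pseudo-distributed values):
grep -A 1 mapred.job.tracker conf/mapred-site.xml   # e.g. localhost:9001 -> Map/Reduce Master
grep -A 1 fs.default.name conf/core-site.xml        # e.g. hdfs://localhost:9000 -> DFS Master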


4. Create a new project.
File --> New --> Other --> Map/Reduce Project; the project name can be anything, for example WordCount.
Copy WordCount.java from the src/examples/org/apache/hadoop/examples directory of the Hadoop installation into the newly created WordCount project, and delete the first line of WordCount.java (the package declaration); one way to do this from a terminal is sketched below.
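
A sketch, with assumptions: the Hadoop path is a guess at where Hadoop was unpacked, the workspace path is taken from step 6 below, and src/ assumes the project's default source folder (you can equally copy-paste the file inside Eclipse):
cp /home/wangxing/development/hadoop-0.20.2/src/examples/org/apache/hadoop/examples/WordCount.java \
   /home/wangxing/development/eclipseworkspace/WordCount/src/
# then refresh the project in Eclipse (F5) so it picks up the new file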



5. Create a new word.txt locally with the following contents:
java c++ python c
java c++ javascript
helloworld hadoop mapreduce
java hadoop hbase
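
For example, from a terminal (a sketch; the path matches the one used for copyFromLocal in step 6, so adjust it to your own workspace):
cat > /home/wangxing/development/eclipseworkspace/word.txt <<'EOF'
java c++ python c
java c++ javascript
helloworld hadoop mapreduce
java hadoop hbase
EOF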

6. Create the /tmp/wordcount directory on HDFS with the following Hadoop command:
bin/hadoop fs -mkdir /tmp/wordcount
Then copy the local word.txt to HDFS with the copyFromLocal command, as follows:
bin/hadoop fs -copyFromLocal /home/wangxing/development/eclipseworkspace/word.txt /tmp/wordcount/word.txt
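
Optionally, confirm that the file arrived on HDFS:
bin/hadoop fs -ls /tmp/wordcount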

7. Running the project
(1) In the new project, right-click WordCount.java --> Run As --> Run Configurations.
(2) In the Run Configurations dialog that pops up, click Java Application, right-click --> New, and create a new configuration named WordCount.
(3) Configure the run parameters: open the Arguments tab and, under Program arguments, enter the input file you want to pass to the program and the folder where the program should write its results, for example:
hdfs://localhost:9000/tmp/wordcount/word.txt hdfs://localhost:9000/tmp/wordcount/out
(4) Click Run to run the program.
After a while the job completes. Once the run has finished, look at the example's output with the command:
bin/hadoop fs -ls /tmp/wordcount/out
You will find two folders and one file there; view the results in part-r-00000 with the command:
bin/hadoop fs -cat /tmp/wordcount/out/part-r-00000

