Installing Eclipse on Ubuntu, Writing MapReduce, and Compiling the hadoop-eclipse Plugin

Source: Internet
Author: User
Tags: hadoop, fs

Original address: http://blog.csdn.net/coolcgp/article/details/43448135, with some changes and additions.

First, install Eclipse from the Ubuntu Software Center

Second, copy hadoop-eclipse-plugin-1.2.1.jar into the plugins directory under the Eclipse installation directory, /usr/lib/eclipse/plugins (if you do not know where Eclipse is installed, run whereis eclipse in a terminal to find out). If Eclipse was installed in the default location, just run:

sudo cp hadoop-eclipse-plugin-1.2.1.jar /usr/lib/eclipse/plugins
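Putting the copy and the restart together, a sketch assuming the default Ubuntu paths (the jar name and plugins directory come from this article; the -clean flag is a standard Eclipse launcher option that makes it rescan its plugins folder on the next start):

```shell
# Sketch: install the plugin into the default Ubuntu Eclipse location and
# restart Eclipse with -clean so the plugin registry is rebuilt.
JAR=hadoop-eclipse-plugin-1.2.1.jar
PLUGINS=/usr/lib/eclipse/plugins          # check with: whereis eclipse
if [ -f "$JAR" ] && [ -d "$PLUGINS" ]; then
  sudo cp "$JAR" "$PLUGINS"
  eclipse -clean &                        # -clean forces a plugin rescan
else
  echo "plugin jar or Eclipse plugins directory not found"
fi
```

If the Map/Reduce entries still do not appear in Eclipse afterwards, check the plugin's Hadoop version against your installed Hadoop (1.2.1 here).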

Note: use a pre-compiled hadoop-eclipse-plugin-1.2.1.jar, for example the one at http://download.csdn.net/detail/poisonchry/7412615

Third, start Eclipse and open the Map/Reduce view

1. Select Window --> Preferences

2. Select Hadoop Map/Reduce

3. Enter the installation path of Hadoop

Fourth, managing HDFS through Eclipse

1. Window --> Show View --> Other --> MapReduce Tools, and open Map/Reduce Locations

2. In the Map/Reduce Locations view, right-click and select New Hadoop Location

(Create a new Hadoop location in Map/Reduce Locations: right-click in the view and choose New Hadoop location. In the dialog that pops up you need to configure the Location name, e.g. myubuntu, as well as the Map/Reduce Master and the DFS Master. Their host and port are the addresses and ports you configured in mapred-site.xml and core-site.xml, respectively.)
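For reference, in Hadoop 1.x the DFS Master corresponds to fs.default.name in core-site.xml and the Map/Reduce Master to mapred.job.tracker in mapred-site.xml. A minimal sketch of the relevant properties, assuming the server address 192.168.83.51 used later in this article; port 9000 matches the hdfs:// URLs below, while 9001 is an assumed JobTracker port — use whatever your own site files say:

```xml
<!-- core-site.xml: DFS Master host and port -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://192.168.83.51:9000</value>
</property>

<!-- mapred-site.xml: Map/Reduce Master host and port -->
<property>
  <name>mapred.job.tracker</name>
  <value>192.168.83.51:9001</value>
</property>
```

The host and port you type into the New Hadoop Location dialog must match these values exactly.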

3. Managing HDFS

First open the Map/Reduce perspective:

Window --> Open Perspective --> Other, then select Map/Reduce (its icon is a blue elephant).

Exit after the configuration is complete. Click DFS Locations --> myubuntu. If folders are displayed, the configuration is correct; if "Connection refused" is shown, check your configuration. (After my configuration completed I could only see a tmp folder, and I do not know the reason.)
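When DFS Locations reports a refused connection, it helps to verify from a shell that the server ports are actually listening before blaming the Eclipse settings. A sketch, assuming the article's server address and the ports 9000 (NameNode, per the hdfs:// URLs below) and an assumed 9001 for the JobTracker:

```shell
# Sketch: probe the NameNode/JobTracker ports that Eclipse connects to.
# 192.168.83.51 and port 9001 are assumptions taken from/for this article.
HOST=192.168.83.51
if command -v nc >/dev/null 2>&1; then
  for PORT in 9000 9001; do
    if nc -z -w 2 "$HOST" "$PORT" 2>/dev/null; then
      echo "$HOST:$PORT reachable"
    else
      echo "$HOST:$PORT not reachable"   # firewall, wrong port, or daemon down
    fi
  done
else
  echo "nc not installed; try: telnet $HOST 9000"
fi
```

If a port is unreachable, check that the daemons are running on the server and that the configured host is not localhost (which only works when Eclipse runs on the same machine).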


Fifth, create a new MapReduce project

Enter the HDFS input folder and output folder as the program arguments (separated by a space).

Note that the input and output addresses are folders in the HDFS file system on the server.

My input and output addresses are hdfs://192.168.83.51:9000/user/hadoop/input and hdfs://192.168.83.51:9000/user/hadoop/output2

Note: create the input folder in advance with hadoop fs -mkdir input and upload your data into it; the output2 folder is created by the job itself and must not already exist.
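Preparing the input folder from the command line can be sketched as follows (assuming the Hadoop 1.x shell is on the PATH on the cluster node; the *.txt file names are illustrative):

```shell
# Sketch: create the HDFS input folder and upload sample data for the job.
if command -v hadoop >/dev/null 2>&1; then
  hadoop fs -mkdir input          # create the HDFS input folder
  hadoop fs -put ./*.txt input    # upload some local text files as job input
  hadoop fs -ls input             # confirm the files arrived
else
  echo "hadoop CLI not found; run these commands on the cluster node"
fi
```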

Click "Run" in the lower right corner

Note: when you run the same MapReduce program several times, delete the output folder between runs; otherwise the job fails, because it wants to create the output folder itself and refuses to overwrite an existing one. After deleting the output folder (output2 in the example above), run it again.
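The deletion can also be done from the shell, a sketch assuming the Hadoop 1.x command set (where recursive delete is fs -rmr; Hadoop 2.x uses fs -rm -r):

```shell
# Sketch: remove the previous run's output folder so the job can recreate it.
if command -v hadoop >/dev/null 2>&1; then
  hadoop fs -rmr output2          # recursive delete of the old job output
else
  echo "hadoop CLI not found; run this on the cluster node"
fi
```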

