Configure and use the Hadoop plug-in for Eclipse

I. Environment Configuration
1. Eclipse version 3.3.x
2. Hadoop version 0.20.2
II. Configuration Process
1. Copy hadoop-0.20.2/contrib/eclipse-plugin/hadoop-0.20.2-eclipse-plugin.jar into eclipse-SDK-3.3.2-win32/eclipse/plugins.
2. Start Eclipse, click Window -> Show View -> Other, and select MapReduce Tools -> Map/Reduce Locations (I have already created one):

3. Click the blue elephant icon, and the location configuration dialog appears. Host is the IP address of the master node, 9001 is the mapred.job.tracker port, and 9000 is the fs.default.name port (a sketch of these settings follows at the end of this section).

4. The configuration is now complete. Note that the plug-in requires JDK 1.6; otherwise an error message like the following is reported (class-file major version 50.0 corresponds to Java 6, 49.0 to Java 5, and 48.0 to Java 1.4):
Unsupported major.minor version 50.0
Unsupported major.minor version 49.0
Unsupported major.minor version 48.0
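
The Host, 9001, and 9000 values from step 3 mirror two settings on the cluster side, fs.default.name and mapred.job.tracker. A minimal sketch of the same values set programmatically on a Hadoop Configuration object, assuming a master node at the placeholder address 192.168.1.100:

    import org.apache.hadoop.conf.Configuration;

    // Sketch only: 192.168.1.100 stands in for the master's real IP address.
    public class MasterAddresses {
        public static void main(String[] args) {
            Configuration conf = new Configuration();
            // fs.default.name uses the 9000 port from the location dialog.
            conf.set("fs.default.name", "hdfs://192.168.1.100:9000");
            // mapred.job.tracker uses the 9001 port from the location dialog.
            conf.set("mapred.job.tracker", "192.168.1.100:9001");
            System.out.println(conf.get("fs.default.name"));
            System.out.println(conf.get("mapred.job.tracker"));
        }
    }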
III. How to Use
1. View the file system. Click the blue elephant icon in the upper-right corner of Eclipse, and a DFS Locations node appears in the Project Explorer on the left. Expand it to browse the HDFS directory structure (a programmatic equivalent is sketched after this list).
2. Run the job with Run on Hadoop. Note the following three points (a driver sketch follows this list):
A. Remember to configure input and output parameters.
B. Remember to add conf.set("hadoop.job.ugi", "root,hadoop") to the main function; root is the user name and hadoop is the group.
C. The project needs to include all the JAR packages under hadoop's lib directory as well as those in the hadoop root directory. Not all of them are actually required, but to save trouble I added them all.
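
For point 1 above, a minimal sketch of the programmatic equivalent of browsing DFS Locations; the hdfs://192.168.1.100:9000 address is an assumption matching the placeholder used in the configuration section:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ListHdfsRoot {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Assumed master address; use the same fs.default.name as the plug-in location.
            conf.set("fs.default.name", "hdfs://192.168.1.100:9000");
            FileSystem fs = FileSystem.get(conf);
            // Print the top-level entries that the DFS Locations node would show.
            for (FileStatus status : fs.listStatus(new Path("/"))) {
                System.out.println(status.getPath());
            }
        }
    }

For points A and B, a minimal driver sketch using the old 0.20 mapred API; MyJob is a placeholder class name, the mapper and reducer are left at their defaults, and "root,hadoop" is simply the example user and group from point B:

    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapred.FileInputFormat;
    import org.apache.hadoop.mapred.FileOutputFormat;
    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;

    public class MyJob {
        public static void main(String[] args) throws Exception {
            JobConf conf = new JobConf(MyJob.class);
            conf.setJobName("myjob");
            // Point B: user name and group used when talking to the cluster.
            conf.set("hadoop.job.ugi", "root,hadoop");
            // Point A: input and output paths come from the program arguments
            // configured in the Run on Hadoop dialog.
            FileInputFormat.setInputPaths(conf, new Path(args[0]));
            FileOutputFormat.setOutputPath(conf, new Path(args[1]));
            // Mapper and reducer classes would be set here; omitted in this sketch.
            JobClient.runJob(conf);
        }
    }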

IV. Summary
1. The plug-in is easy to use and is recommended for development and debugging.
2. Adding or deleting HDFS files and directories through the plug-in did not work with this configuration alone; there are documents on the Internet describing the extra configuration needed, but I did not get it working.
3. When running a job, pay attention to the input and output paths as well as loading the required JAR packages.
4. Some warnings are issued during execution; they do not affect the results and are reportedly caused by some old configuration files.
