Notes on debugging Hadoop in Eclipse


 

1. Configure the debug script

Edit the hadoop-debug script under hadoop/bin:

......

elif [ "$COMMAND" = "tasktracker" ]; then

  CLASS=org.apache.hadoop.mapred.TaskTracker

  HADOOP_OPTS="$HADOOP_OPTS $HADOOP_TASKTRACKER_OPTS -Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=9005"

......

2. Edit mapred-site.xml

Add the following property:

<property>

  <name>mapred.map.child.java.opts</name>

  <value>-Xmx2000m -agentlib:jdwp=transport=dt_socket,address=9010,server=y,suspend=y</value>

</property>

3. Start hadoop-debug, then in Eclipse open Run As > Debug Configurations for your MapReduce project.

Note:
* Create a MapReduce project and replace hadoop-core.jar.
* Set the MapReduce port.
* Add a new parameter path and set the input parameters.

Steps

Modify the mapred-site.xml file to add the following configuration:

XML code:

<property>

  <name>mapred.child.java.opts</name>

  <value>-agentlib:jdwp=transport=dt_socket,address=8883,server=y,suspend=y</value>

</property>

Disable all TaskTrackers and keep only the one TaskTracker configured for debugging.

Start the MapReduce job.

Right-click the Hadoop src project, choose Debug As > Debug Configurations, select Remote Java Application, add a new configuration, and enter the remote host's IP address and listening port (8883 in the example above). Then click Debug. Eclipse connects to the remote TaskTracker child process and stops at your breakpoint, and you can step through the code.
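As a quick check outside Eclipse, you can also attach to the same debug port with jdb from the JDK. This is a sketch assuming the TaskTracker child JVM was started with address=8883 as configured above, on the local machine:

```
jdb -connect com.sun.jdi.SocketAttach:hostname=localhost,port=8883
```

If jdb attaches successfully, the JDWP agent is listening and Eclipse should be able to connect as well.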

Example of starting a VM for an Eclipse connection in socket mode

java -Xdebug -Xrunjdwp:transport=dt_socket,server=y,address=8000 -jar test.jar

Use a remote launch configuration in Eclipse and specify the address of the target VM. To do this, open Run > Debug Configurations and double-click Remote Java Application in the Eclipse menu. In the new launch configuration, specify the IP address and port of the target application. To debug an application running on the same machine, simply set the host to localhost or 127.0.0.1.
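For practice, any small program can serve as the target VM. This hypothetical class (standing in for the test.jar in the command above) gives you a loop worth stepping through once the debugger attaches:

```java
// Hypothetical target application for remote-debugging practice.
// Package it as test.jar and launch it with the JDWP options shown above:
//   java -Xdebug -Xrunjdwp:transport=dt_socket,server=y,address=8000 -jar test.jar
// With suspend=y (the default), the VM waits for a debugger before running main.
public class DebugTarget {

    // A small method worth stepping through line by line.
    static int sum(int n) {
        int total = 0;
        for (int i = 1; i <= n; i++) {
            total += i;  // set a breakpoint on this line in Eclipse
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.println("sum(10) = " + sum(10));
    }
}
```

Once Eclipse connects via Remote Java Application on port 8000, execution stops at the breakpoint inside the loop and you can inspect `total` and `i` each iteration.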

Question 1: On Windows, Hadoop fails with "Failed to set permissions of path"?

A:

Replace the hadoop-core-1.0.2.jar referenced by the project with hadoop-core-0.20.2.jar.

Or download the modified jar package.

https://skydrive.live.com/?cid=cf7746837803bc50&id=cf7746837803bc50%211276&authkey=!Ajccrnrx9rcf6fa

Or put the org.apache.hadoop.fs.FileUtil class from the Hadoop source code into your project and modify FileUtil.checkReturnValue yourself so that this exception is not thrown on Windows.

Or recompile the Hadoop jar.
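The FileUtil edit usually amounts to gutting the permission check. This is a sketch of the widely circulated workaround, assuming the Hadoop 1.0.x method signature; it is not a fix, only a way to keep local jobs running on Windows:

```java
// In your project's copy of org.apache.hadoop.fs.FileUtil (Hadoop 1.0.x):
// the original method threw IOException("Failed to set permissions of path: ...")
// whenever the underlying chmod reported failure, which it always does on
// plain Windows. Emptying the body skips the check, at the cost of silently
// ignoring real permission failures.
private static void checkReturnValue(boolean rv, File p, FsPermission permission)
    throws IOException {
  // intentionally left empty for local debugging on Windows
}
```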

Question 2: Hadoop reports an "Input path does not exist" exception?

A:

http://blog.csdn.net/longzaitianguo/article/details/6773468

Because the local input directory was not uploaded to HDFS, an org.apache.hadoop.mapred.InvalidInputException occurs:

Input path does not exist: hdfs://localhost:9000/user/root/input

Solution: create the input directory from Eclipse (or upload it to HDFS) and configure it as the program's input parameter.
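Equivalently, the directory can be created and populated from the command line. A sketch using the Hadoop 1.x shell, where mydata/ is a placeholder for your local input files:

```
hadoop fs -mkdir /user/root/input         # create the input directory on HDFS
hadoop fs -put mydata/* /user/root/input  # upload the local input files
hadoop fs -ls /user/root/input            # verify the files are present
```

After this, hdfs://localhost:9000/user/root/input exists and the job's input path resolves.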
