Eclipse Configuration to Execute a Hadoop 2.7 Program: Sample Steps

Source: Internet
Author: User
Tags log4j

Prerequisite: you have already built a Hadoop 2.x environment on Linux and can run jobs there successfully, and you have a Windows machine that can access the cluster.

1. Add the following property to hdfs-site.xml to turn off permission checking on the cluster. The Windows username is generally different from the Linux user, so simply disable the check. Remember: this goes in hdfs-site.xml, not core-site.xml. Restart the cluster afterwards.
<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>
2. Put hadoop-eclipse-plugin-2.7.0.jar into Eclipse's plugins folder.

3. Copy the Hadoop installation folder from the Linux master to Windows, then point Eclipse's Hadoop configuration at that folder and restart Eclipse.

4. Open the Eclipse preferences, find the Hadoop option, and set the Hadoop home folder. Then open the Map/Reduce view and configure the location properties. Leave the other additional properties alone and do not change the username; at this point the connection should work.


5. Create a new Map/Reduce project.

6. Create a WordCount class (write or copy one yourself).
Then create a log4j.properties file directly under src, with content like the following. It mainly controls log output; you can change INFO to DEBUG, but DEBUG is very verbose, so INFO is used here:
log4j.rootLogger=INFO,stdout,R
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%5p-%m%n
log4j.appender.R=org.apache.log4j.RollingFileAppender
log4j.appender.R.File=mapreduce_test.log
log4j.appender.R.MaxFileSize=1MB
log4j.appender.R.MaxBackupIndex=1
log4j.appender.R.layout=org.apache.log4j.PatternLayout
log4j.appender.R.layout.ConversionPattern=%p%t%c-%m%n
log4j.logger.com.codefutures=DEBUG
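The WordCount class mentioned in step 6 is left to the reader; the following is a minimal dependency-free sketch of the logic it implements. This is NOT the Hadoop Mapper/Reducer API — `map` and `reduce` here are plain methods that mimic the (word, 1) emission and the grouped summation, so the data flow can be seen without a cluster:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

// Dependency-free sketch of what a Hadoop WordCount job computes.
public class WordCountSketch {

    // "map" step: emit a (word, 1) pair for every token in a line.
    static List<Map.Entry<String, Integer>> map(String line) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String word : line.trim().split("\\s+")) {
            if (!word.isEmpty()) {
                pairs.add(Map.entry(word, 1));
            }
        }
        return pairs;
    }

    // "shuffle + reduce" step: group pairs by word and sum the counts.
    static Map<String, Integer> reduce(List<Map.Entry<String, Integer>> pairs) {
        Map<String, Integer> counts = new TreeMap<>();
        for (Map.Entry<String, Integer> p : pairs) {
            counts.merge(p.getKey(), p.getValue(), Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String line : new String[] {"hello hadoop", "hello eclipse"}) {
            pairs.addAll(map(line));
        }
        System.out.println(reduce(pairs)); // {eclipse=1, hadoop=1, hello=2}
    }
}
```

In the real job, the same two steps are written as a `Mapper` and a `Reducer` subclass and wired together through a `Job` object.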

7. Now comes a long series of errors.
Running the job reports:
1) java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
2) java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.

Solution for 1): HADOOP_HOME is not set, so configure it as a Windows environment variable pointing at the Hadoop folder.
Run again:
The first error is gone, but the second remains:
2) java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
It still cannot locate the executable, so HADOOP_HOME did not take effect. In that case set the property in main so the path is no longer null:
System.setProperty("hadoop.home.dir", "E:\\bigdata\\hadoop2");
Run again:
Could not locate executable E:\bigdata\hadoop2\bin\winutils.exe in the Hadoop binaries.
The path has changed this time. Checking the folder confirms there is indeed no winutils.exe, because the installation was copied from Linux.
So download the bin folder from https://github.com/srccodes/hadoop-common-2.2.0-bin and use it to replace the bin folder of the Hadoop copy on Windows.
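The in-code workaround for HADOOP_HOME can be sketched as follows. The path E:\bigdata\hadoop2 is the example folder from this article and must be adjusted to wherever your local Hadoop copy lives (the folder whose bin contains winutils.exe):

```java
// Sketch: set hadoop.home.dir programmatically when the HADOOP_HOME
// environment variable is not picked up by the running JVM.
public class HadoopHomeWorkaround {
    public static void main(String[] args) {
        // Must be the first thing in main, before any Hadoop class
        // (Job, Configuration, ...) is loaded, because Hadoop's Shell
        // utility reads the property in a static initializer.
        System.setProperty("hadoop.home.dir", "E:\\bigdata\\hadoop2");
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```

Once the property is set, the "null\bin\winutils.exe" part of the error becomes a concrete path, which is how the missing winutils.exe was discovered above.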
Run again:
The second error is gone, but a new one appears:
Exception in thread "main" org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: file:/input
The paths were set as absolute paths in the Hadoop 1 style; this is the second generation, so set the program arguments to:
hdfs://master:9000/input hdfs://master:9000/output
Execution errors again:
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
This is a native library problem, so change the source to block the native library call:

Copy the source file org.apache.hadoop.io.nativeio.NativeIO into the project. In Hadoop 2.2 the relevant code is around line 570; the version differs today, and in 2.7 it is around line 607 (or just search for public static boolean access(String path, AccessRight desiredAccess)). Change the method body to return true; so the access check is no longer performed.


Copying the whole source file is tedious; alternatively, create a new class, give it the same package, and copy the contents of the class into it.
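The one-line change to the copied NativeIO source can be sketched like this. The real file keeps Hadoop's package and surrounding code; the AccessRight enum below is only a stand-in for Hadoop's nested NativeIO.Windows.AccessRight so the snippet compiles on its own:

```java
import java.io.IOException;

// Standalone sketch of the patched method in the copied NativeIO.java
// (real package: org.apache.hadoop.io.nativeio).
public class NativeIOPatchSketch {
    // Stand-in for Hadoop's NativeIO.Windows.AccessRight enum.
    enum AccessRight { ACCESS_READ, ACCESS_WRITE, ACCESS_EXECUTE }

    // The original body delegated to the native access0 call that throws
    // the UnsatisfiedLinkError; the patch simply returns true so the
    // Windows access check is skipped.
    public static boolean access(String path, AccessRight desiredAccess)
            throws IOException {
        return true;
    }

    public static void main(String[] args) throws IOException {
        System.out.println(access("E:\\any\\path", AccessRight.ACCESS_WRITE));
    }
}
```

This is a local-development hack only: it disables a safety check, which is another reason not to run this setup against a production cluster.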





Continue executing; the next error is:

org.apache.hadoop.security.AccessControlException: Permission denied: user=Administrator, access=WRITE, inode="/output/_temporary/0":root:supergroup:drwxr-xr-x

Insufficient permissions: the Linux user is root, while the default Windows user is Administrator. I changed the Windows user to root, but that did not work. I had also added the no-permission-check property to core-site.xml, to no effect. So, rather than changing the source code again, I tried another way: I moved the dfs.permissions=false property from core-site.xml into hdfs-site.xml.

(That is why step 1 above says to make the change in hdfs-site.xml rather than core-site.xml.)

Run again: no more errors, and the job produces results:

Input split bytes=100, Combine input records=179, Combine output records=131, Reduce input groups=131, Reduce shuffle bytes=1836, Reduce input records=131, Reduce output records=131

As for "WARN - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable": this warning is harmless, and it will not appear when the job is run on Linux.

8. Finally, I tried splitting WordCount into separate classes, moving the mapper out, because keeping multiple classes in one file sometimes triggers errors. Delete the output folder and run again: OK, no problems, and the results are correct.
9. Welcome to visit hegou.me. Note that this only builds a test environment on the local machine; don't run against a production cluster at work. It is better to test the code here first.





