Configuring Eclipse to Run a Hadoop 2.7 Program: Example Reference Steps

Source: Internet
Author: User
Tags log4j

Premise: You have already built a Hadoop 2.x environment on Linux and can run jobs successfully, and you have a Windows machine that can reach the cluster.

1. Add the following property to hdfs-site.xml to turn off the cluster's permission check; the Windows user is generally not the same as the Linux user, so simply shutting the check off is OK. Remember, it goes in hdfs-site.xml, not core-site.xml, and the cluster must be restarted afterwards:
<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>
2. Put the plugin hadoop-eclipse-plugin-2.7.0.jar into Eclipse's plugins directory.

3. Copy the Hadoop installation directory from the Linux master to Windows, then configure Eclipse to point at that Hadoop directory, and restart Eclipse.

4. Open the configuration, find the Hadoop option, and set it to the Hadoop home directory. Then open the Map/Reduce view and configure the connection properties. Leave the other additional properties unconfigured and don't change the user name; at this point the connection should work.


5. Create a new Map/Reduce project.

6. Build the WordCount class (copy it in yourself).
Create a new log4j.properties directly under src with the following content. It mainly prints log information (you can change info to debug, but that generates too much output, so change it back to info):
log4j.rootLogger=INFO,stdout,R
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%5p - %m%n
log4j.appender.R=org.apache.log4j.RollingFileAppender
log4j.appender.R.File=mapreduce_test.log
log4j.appender.R.MaxFileSize=1MB
log4j.appender.R.MaxBackupIndex=1
log4j.appender.R.layout=org.apache.log4j.PatternLayout
log4j.appender.R.layout.ConversionPattern=%p %t %c - %m%n
log4j.logger.com.codefutures=DEBUG

7. Plenty of errors along the way.
Run reports:
1) java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
2) java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.

Solution to 1): HADOOP_HOME is not set, so configure a Windows environment variable pointing to the Hadoop directory.
Run again:
The first error is gone; the second continues:
2) java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
It says the executable cannot be located. Since setting HADOOP_HOME doesn't take effect, add this line in main() so the path is no longer null:
System.setProperty("hadoop.home.dir", "E:\\bigdata\\hadoop2");
Run again:
Could not locate executable E:\bigdata\hadoop2\bin\winutils.exe in the Hadoop binaries.
The message changed this time. Checking the directory confirms there really is no winutils.exe, because we copied the installation from Linux.
So add it: download the bin directory from https://github.com/srccodes/hadoop-common-2.2.0-bin and use it to replace the bin directory of the Hadoop copy on Windows.
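The hadoop.home.dir fix above can be written as a standalone sketch. The path E:\bigdata\hadoop2 is just this article's example location; adjust it to your own copy. The property must be set before any Hadoop class is loaded:

```java
// Minimal sketch: set hadoop.home.dir programmatically so Hadoop can find
// %HADOOP_HOME%\bin\winutils.exe on Windows. This must run before any
// Hadoop class is touched. The path below is this article's example.
class HadoopHomeSetup {
    public static void main(String[] args) {
        System.setProperty("hadoop.home.dir", "E:\\bigdata\\hadoop2");
        // Hadoop's Shell utility reads this property when HADOOP_HOME is unset.
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```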
Run again:
The second error is gone; a new one appears:
Exception in thread "main" org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: file:/input
Well, the parameters were set as absolute local paths, which worked on Hadoop 1.x, but this is the second generation now:
Sweetie, set the parameters to hdfs://master:9000/input and hdfs://master:9000/output.
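The input and output arguments must be fully qualified HDFS URIs whose authority matches the cluster's fs.defaultFS (hdfs://master:9000 in this article). As an illustrative sketch, a plain-Java sanity check of such a URI:

```java
import java.net.URI;

// Sanity-check that a job argument is a fully qualified HDFS URI
// (scheme hdfs, host master, port 9000 in this article's cluster).
class HdfsUriCheck {
    public static void main(String[] args) {
        URI input = URI.create("hdfs://master:9000/input");
        System.out.println(input.getScheme()); // hdfs
        System.out.println(input.getHost());   // master
        System.out.println(input.getPort());   // 9000
        System.out.println(input.getPath());   // /input
    }
}
```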
Run again, and the error continues:
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
That is a native library problem, so modify the source to block out the native library call:

Copy the source file org.apache.hadoop.io.nativeio.NativeIO into the project and locate line 570 (that was Hadoop 2.2; the line differs by version, and in 2.7 it is around line 607 — or just search for public static boolean access(String path, AccessRight desiredAccess)). Change the method body to simply return true; so the access check is never performed.
If copying the file is troublesome, create a new class, match the package name, and paste in the contents of the class.
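The patch amounts to short-circuiting the permission check. A standalone stub mirroring the change (the enum here is a simplified stand-in for Hadoop's NativeIO.Windows.AccessRight, not the real class):

```java
// Standalone stub mirroring the NativeIO patch: instead of calling the
// native access0() check, the copied method simply reports success.
class NativeIOPatchSketch {
    // Simplified stand-in for org.apache.hadoop.io.nativeio.NativeIO.Windows.AccessRight
    enum AccessRight { ACCESS_READ, ACCESS_WRITE, ACCESS_EXECUTE }

    // Patched version of Windows.access(): always allow, skipping access0()
    static boolean access(String path, AccessRight desiredAccess) {
        return true;
    }

    public static void main(String[] args) {
        System.out.println(access("E:\\bigdata\\input", AccessRight.ACCESS_WRITE));
    }
}
```

This disables client-side access checking entirely, so use it only in a local test environment.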



Continue to run. Error:
org.apache.hadoop.security.AccessControlException: Permission denied: user=Administrator, access=WRITE, inode="/output/_temporary/0": root:supergroup:drwxr-xr-x
Insufficient permissions: the Linux user is root, while the Windows default user is Administrator. I changed the Windows user to root, but that did not seem to work. I had also added the property that disables the permission check to core-site.xml, to no avail. So reset the parameter another way: the permission check had been set to false in core-site.xml before, so move it to hdfs-site.xml and try; if that fails, change the source code. (This is why step 1 above tells you to change it in hdfs-site.xml rather than core-site.xml.)
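An alternative to disabling the permission check (an addition here, not something the article tested): make the Windows client act as the cluster user. Hadoop 2.x's UserGroupInformation falls back to a HADOOP_USER_NAME system property when the environment variable is absent, so a sketch looks like:

```java
// Alternative workaround sketch: tell Hadoop which remote user to act as,
// instead of disabling the cluster's permission checks. Hadoop 2.x reads
// the HADOOP_USER_NAME system property as a fallback to the env variable.
class HadoopUserSetup {
    public static void main(String[] args) {
        // "root" is the cluster user in this article's setup.
        System.setProperty("HADOOP_USER_NAME", "root");
        System.out.println(System.getProperty("HADOOP_USER_NAME"));
    }
}
```

Set this before the job is submitted. It only changes the user name sent to HDFS, so it is unsuitable where real authentication (e.g. Kerberos) is in place.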

Run again: no more errors are reported, and the result comes out:
Input split bytes=100
Combine input records=179
Combine output records=131
Reduce input groups=131
Reduce shuffle bytes=1836
Reduce input records=131
Reduce output records=131
As for WARN - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable: this warning is fine; it is not reported when running on Linux.

8. Finally, try splitting WordCount into separate classes, moving the Mapper out on its own, because sometimes having multiple classes in one file will trigger an error. Delete the output directory and rerun: OK, no problem, the results are correct.
9. Welcome to visit hegou.me. That's all: this just builds a test environment on the local machine. Don't run against the production cluster at work; test the code here first.





