I. Build your own development environment
Today I put together a development environment of CentOS 5.3 + Hadoop 2.2 + HBase 0.96.1.1, and successfully debugged MapReduce from Eclipse on Win7. Probably because these versions are so new, I hit problems for which no complete solution could be found online, so I had to work them out on my own.
II. Hadoop installation
I won't go into detail here; there are plenty of articles online. I downloaded hadoop-2.2.0.tar.gz.
- http://www.cnblogs.com/xia520pi/archive/2012/05/16/2503949.html describes the Hadoop HDFS installation in great detail. It is not about Hadoop 2.2, but the configuration is very similar.
- For the MapReduce configuration, refer to http://blog.sina.com.cn/s/blog_546abd9f0101i8b8.html.
After a successful installation, you should be able to open the following pages. My cluster environment uses 200 as the master and 201-203 as slaves.
- dfs.http.address 192.168.1.200:50070
- dfs.secondary.http.address 192.168.1.200:50090
- dfs.datanode.http.address 192.168.1.201:50075
- yarn.resourcemanager.webapp.address 192.168.1.200:50030
- mapreduce.jobhistory.webapp.address 192.168.1.200:19888. This one may not be reachable at first; you need to run hadoop/sbin/mr-jobhistory-daemon.sh start historyserver before it becomes accessible.
III. hadoop2.x eclipse-plugin
https://github.com/winghc/hadoop2x-eclipse-plugin
The plugin is still under development at the moment; you can download the source and compile it yourself, and precompiled jars have also been published on the web.
http://blog.csdn.net/zythy/article/details/17397153 describes the steps in detail.
It is important to note that the "Hadoop installation directory" field must be filled with the Hadoop home path under Windows; its purpose is that, when you create a MapReduce project, the plugin can automatically pull in the jars MapReduce needs from that location. So unzip hadoop-2.2.0.tar.gz to a local directory first.
IV. Various issues
With the previous steps complete, I created a MapReduce project and immediately hit a problem at runtime.
Java code:
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
Tracing through the code shows this is a HADOOP_HOME problem: if HADOOP_HOME is empty, fullExeName inevitably becomes null\bin\winutils.exe. The fix is simple: configure the environment variable properly. If you don't want to restart the computer, you can temporarily add System.setProperty("hadoop.home.dir", "...") to the MapReduce program (a sketch of this workaround follows the excerpt below). The relevant code is in org.apache.hadoop.util.Shell.java. Java code:
public static final String getQualifiedBinPath(String executable)
    throws IOException {
  // construct hadoop bin path to the specified executable
  String fullExeName = HADOOP_HOME_DIR + File.separator + "bin"
      + File.separator + executable;
  File exeFile = new File(fullExeName);
  if (!exeFile.exists()) {
    throw new IOException("Could not locate executable " + fullExeName
        + " in the Hadoop binaries.");
  }
  return exeFile.getCanonicalPath();
}

private static String HADOOP_HOME_DIR = checkHadoopHome();

private static String checkHadoopHome() {
  // first check the Dflag hadoop.home.dir with JVM scope
  String home = System.getProperty("hadoop.home.dir");
  // fall back to the system/user-global env variable
  if (home == null) {
    home = System.getenv("HADOOP_HOME");
  }
  ...
}
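For reference, here is a minimal sketch of the temporary workaround in a driver's main(); the class name and path are just examples from my setup, so point the path at wherever you unzipped Hadoop. Java code:
public class WordCountDriver {
  public static void main(String[] args) throws Exception {
    // Run this before any Hadoop class touches Shell, i.e. first thing in main().
    // The path below is an example; adjust it to your local unzip of hadoop-2.2.0.
    System.setProperty("hadoop.home.dir", "D:\\Hadoop\\tar\\hadoop-2.2.0\\hadoop-2.2.0");
    // ... normal job setup continues here ...
  }
}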
This time fullExeName gets a full path; on my machine it is D:\Hadoop\tar\hadoop-2.2.0\hadoop-2.2.0\bin\winutils.exe. Continuing to run, the next error appears. Java code:
Could not locate executable D:\Hadoop\tar\hadoop-2.2.0\hadoop-2.2.0\bin\winutils.exe in the Hadoop binaries.
One look confirms it: there is simply no winutils.exe there. Go to https://github.com/srccodes/hadoop-common-2.2.0-bin, download one, and drop it in.
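A quick way to sanity-check the fix is the sketch below; the lookup mirrors what Shell does (hadoop.home.dir system property first, then the HADOOP_HOME environment variable). Java code:
import java.io.File;

public class WinutilsCheck {
  public static void main(String[] args) {
    // Same order as Shell: JVM property first, then the environment variable.
    String home = System.getProperty("hadoop.home.dir",
        System.getenv("HADOOP_HOME"));
    File winutils = new File(home, "bin" + File.separator + "winutils.exe");
    System.out.println(winutils + " exists: " + winutils.exists());
  }
}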
Continuing, the next problem appears. Java code:
at org.apache.hadoop.util.Shell.execCommand(Shell.java:661)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:639)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:435)
Following the code into org.apache.hadoop.util.Shell.java. Java code:
public static String[] getSetPermissionCommand(String perm, boolean recursive,
    String file) {
  String[] baseCmd = getSetPermissionCommand(perm, recursive);
  String[] cmdWithFile = Arrays.copyOf(baseCmd, baseCmd.length + 1);
  cmdWithFile[cmdWithFile.length - 1] = file;
  return cmdWithFile;
}

/** Return a command to set permission */
public static String[] getSetPermissionCommand(String perm, boolean recursive) {
  if (recursive) {
    return (WINDOWS) ? new String[] { WINUTILS, "chmod", "-R", perm }
                     : new String[] { "chmod", "-R", perm };
  } else {
    return (WINDOWS) ? new String[] { WINUTILS, "chmod", perm }
                     : new String[] { "chmod", perm };
  }
}
The contents of the cmdWithFile array are {"D:\Hadoop\tar\hadoop-2.2.0\hadoop-2.2.0\bin\winutils.exe", "chmod", "755", "xxxfile"}. Running this command by itself in cmd, I found:
This program cannot be started because the computer is missing MSVCR100.dll
So download one from http://files.cnblogs.com/sirkevin/msvcr100.rar and drop it into C:\Windows\System32. Running in cmd again, yet another problem:
The application was unable to start correctly (0xc000007b)
Download DirectX_Repair via http://blog.csdn.net/vbcom/article/details/7245186 to fix this problem. Remember to restart the computer after the repair finishes. After that, the cmd run worked like a charm.
At this point the light was in sight, but then came another problem. Java code:
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
The code in question. Java code:
/** Windows only method used to check if the current process has requested
 *  access rights on the given path. */
private static native boolean access0(String path, int requestedAccess);
Obviously a DLL file is missing. Remember the https://github.com/srccodes/hadoop-common-2.2.0-bin download from earlier? It contains hadoop.dll. The best approach is to replace the local Hadoop bin directory with the hadoop-common-2.2.0-bin-master/bin directory, add HADOOP_HOME/bin to the PATH environment variable, and restart the computer.
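To confirm the native library is now being picked up, here is a tiny check (a sketch only; isNativeCodeLoaded() is a public static method on NativeCodeLoader). Java code:
import org.apache.hadoop.util.NativeCodeLoader;

public class NativeCheck {
  public static void main(String[] args) {
    // The paths the JVM searches for hadoop.dll (on Windows this includes PATH).
    System.out.println("java.library.path = "
        + System.getProperty("java.library.path"));
    // Becomes true once hadoop.dll has loaded successfully.
    System.out.println("native loaded: "
        + NativeCodeLoader.isNativeCodeLoaded());
  }
}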
Finally, I saw the correct output of the MapReduce job, output99.
V. Summary
- The Hadoop Eclipse plugin is not necessary. Its role, in my opinion, comes down to the following three points (this turned out to be a misunderstanding; see http://zy19982004.iteye.com/blog/2031172 instead). study-hadoop is an ordinary project that can be run, and MapReduce debugged, directly, without running on the Hadoop cluster.
- Visualize files in Hadoop.
- Creating a MapReduce project helps you introduce dependent jars.
- Configuration conf = new Configuration() then contains all of the configured information (a minimal driver sketch follows this list).
- Alternatively, download the Hadoop 2.2 source code and compile it yourself; that should also work (not personally tested).
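For illustration, a minimal driver sketch run directly from Eclipse. The fs.defaultFS address and port 9000 are assumptions based on the 192.168.1.200 master above; use the values from your own core-site.xml, or simply put the cluster's XML config files on the classpath so new Configuration() picks everything up. Java code:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ListHdfsRoot {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // Assumption: NameNode RPC address; adjust to match your core-site.xml.
    conf.set("fs.defaultFS", "hdfs://192.168.1.200:9000");
    FileSystem fs = FileSystem.get(conf);
    for (FileStatus status : fs.listStatus(new Path("/"))) {
      System.out.println(status.getPath());
    }
  }
}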
VI. Other issues
Still the same error. Java code:
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
Following the code into org.apache.hadoop.util.NativeCodeLoader.java. Java code:
static {
  // Try to load native hadoop library and set fallback flag appropriately
  if (LOG.isDebugEnabled()) {
    LOG.debug("Trying to load the custom-built native-hadoop library...");
  }
  try {
    System.loadLibrary("hadoop");
    LOG.debug("Loaded the native-hadoop library");
    nativeCodeLoaded = true;
  } catch (Throwable t) {
    // Ignore failure to load
    if (LOG.isDebugEnabled()) {
      LOG.debug("Failed to load native-hadoop with error: " + t);
      LOG.debug("java.library.path=" +
          System.getProperty("java.library.path"));
    }
  }

  if (!nativeCodeLoaded) {
    LOG.warn("Unable to load native-hadoop library for your platform... " +
        "using builtin-java classes where applicable");
  }
}
It logged the following error. Java code:
DEBUG org.apache.hadoop.util.NativeCodeLoader - Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: HADOOP_HOME\bin\hadoop.dll: Can't load AMD 64-bit .dll on a IA 32-bit platform
I suspected a 32-bit JDK problem; after switching to a 64-bit JDK, it went away. Java code:
2014-03- :805 DEBUG org.apache.hadoop.util.NativeCodeLoader - Trying to load the custom-built native-hadoop library...
2014-03- :812 DEBUG org.apache.hadoop.util.NativeCodeLoader - Loaded the native-hadoop library
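To check which JVM Eclipse is actually launching, a quick sketch (sun.arch.data.model is a HotSpot-specific property, so treat it as best-effort). Java code:
public class JvmBitness {
  public static void main(String[] args) {
    // Typically "amd64" for a 64-bit JVM and "x86" for a 32-bit one on Windows.
    System.out.println("os.arch = " + System.getProperty("os.arch"));
    // "64" or "32" on HotSpot JVMs; may be null elsewhere.
    System.out.println("sun.arch.data.model = "
        + System.getProperty("sun.arch.data.model"));
  }
}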
This also takes care of a common warning. Java code:
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable