Just beginning to learn Hadoop, I wanted to set up a friendly development environment. After consulting "Hadoop in Action" and various blogs, I decided to use Eclipse. And since I am using Eclipse, I need the Eclipse Hadoop plugin.
There are two ways to get the Eclipse Hadoop plugin for Hadoop 2.0 and later:
1. Compile it yourself; specific compilation steps can be found in blogs online.
2. Download the eclipse-hadoop-plugin-2x archive hosted on GitHub, which contains a precompiled plugin jar that can be used directly; with luck it will work. Download address:
After placing the Eclipse Hadoop plugin jar into the plugins directory of the Eclipse installation, I set up a DFS Location following the configuration method in "Hadoop in Action", but it throws an error:
"Map/Reduce Location Status Updater" java.lang.NullPointerException. As a Virgo, I could not stand it, but even after a long struggle the problem still could not be solved. Fortunately this warning does not affect connecting to HDFS or adding and deleting directories, provided that Hadoop itself is configured correctly.
I ran the following commands under the Hadoop installation directory:
1. sbin/start-all.sh to start Hadoop
2. bin/hadoop fs -mkdir -p newinput to create a folder
3. Note: when configuring the Hadoop Location in Eclipse, be sure to enter the host as an IP address (192.168.21.134) rather than localhost; otherwise it will not connect to HDFS.
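The steps above can be sketched as a small shell session. The installation path /usr/local/hadoop is an assumption for illustration; adjust it to your own setup:

```shell
# Assumed Hadoop 2.x installation directory; change to your own path.
cd /usr/local/hadoop

# Start the HDFS and YARN daemons (start-all.sh is a convenience wrapper;
# on Hadoop 2.x you can also run sbin/start-dfs.sh and sbin/start-yarn.sh).
sbin/start-all.sh

# Create a working directory on HDFS; -p creates parent directories as needed.
bin/hadoop fs -mkdir -p newinput

# Verify that the new directory is visible on HDFS.
bin/hadoop fs -ls
```

If the last command lists newinput, the cluster is up and the same directory should appear under DFS Locations in Eclipse.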
Then restart Eclipse and refresh DFS Locations; the directories on HDFS can now be displayed. The next step is to run WordCount in the Eclipse environment, hoping it succeeds. (Preparation reference: http://booby325.iteye.com/blog/1309940)
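Before trying WordCount inside Eclipse, it can help to verify it from the command line first. This is a sketch assuming the examples jar bundled with Hadoop 2.x and the newinput directory created above; the exact jar file name varies by version, so check share/hadoop/mapreduce/ for the real one:

```shell
# Upload a sample text file into the input directory on HDFS.
bin/hadoop fs -put etc/hadoop/core-site.xml newinput

# Run the WordCount example shipped with Hadoop 2.x; the jar version below
# is an assumption -- list share/hadoop/mapreduce/ to find the actual name.
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar \
    wordcount newinput newoutput

# Inspect the word counts written by the job.
bin/hadoop fs -cat newoutput/part-r-00000
```

If this runs cleanly, any remaining failures in Eclipse are plugin or location configuration problems rather than cluster problems.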
Reference blogs:
1. http://blog.csdn.net/hongweigg/article/details/7197662
2. http://www.cnblogs.com/flyoung2008/archive/2011/12/09/2281400.html