The Hadoop version used in this post is 0.20.2.
Installing hadoop-0.20.2-eclipse-plugin.jar
- Download the hadoop-0.20.2-eclipse-plugin.jar file and add it to Eclipse's plugin library. The method is simple: locate the plugins directory under the Eclipse installation directory, copy the jar directly into it, and restart Eclipse.
- In the Eclipse toolbar, click Window-----Show View-----Other, enter "map" in the pop-up window, and confirm it looks like the following.
If it does, the plugin was installed successfully.
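The copy-and-restart install step above can be sketched as a small shell snippet. The directory used here is a throwaway stand-in created just so the snippet can run anywhere; on a real machine you would copy into your actual Eclipse installation's plugins directory instead:

```shell
# Throwaway stand-in for an Eclipse install directory (an assumption for
# illustration -- replace with your real Eclipse installation path).
DEMO_ECLIPSE=$(mktemp -d)/eclipse
mkdir -p "$DEMO_ECLIPSE/plugins"
# Stand-in for the downloaded plugin jar:
touch hadoop-0.20.2-eclipse-plugin.jar
# The actual install step: copy the jar into Eclipse's plugins directory.
cp hadoop-0.20.2-eclipse-plugin.jar "$DEMO_ECLIPSE/plugins/"
ls "$DEMO_ECLIPSE/plugins"
# After copying into the real plugins directory, restart Eclipse so the
# plugin is loaded.
```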
Map/Reduce Configuration
- Configure the Hadoop installation directory
Click Window-----Preferences in Eclipse, find Hadoop Map/Reduce in the pop-up window, and select the Hadoop installation directory (the installation here does not need to be exactly the same as the Hadoop environment on the cluster).
- Hadoop Map/Reduce Locations configuration
In the Map/Reduce Locations view, click the New Hadoop location button.
A pop-up window will appear; fill in the fields as shown in the figure.
In the Advanced Parameters tab, enter the content as follows (shown here in two screenshots).
Other settings
Verifying the Hadoop Map/Reduce Locations configuration
In the Map/Reduce perspective's Project Explorer view, expand the map/reduce location you configured under DFS Locations; the configuration is fine if every node can be expanded.
Test the WordCount Program
Add an input directory to the HDFS file system:
hadoop fs -mkdir input
Refresh DFS Locations in Eclipse and upload the files. Here I uploaded two files; add some spaces to the file contents (WordCount splits on spaces to count words).
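To see what WordCount will compute on these files, here is a sketch that creates two sample space-separated input files (the names and contents are made up for illustration) and previews the counts locally with standard Unix tools:

```shell
# Two sample input files with space-separated words:
printf 'hello hadoop hello world\n' > file1.txt
printf 'hello eclipse\n' > file2.txt
# Local analogue of what WordCount computes on the cluster:
# split on whitespace, then count occurrences of each word.
cat file1.txt file2.txt | tr ' ' '\n' | sort | uniq -c
# Counts: hello appears 3 times; hadoop, world, and eclipse once each.
```

On the cluster, the equivalent upload step would be `hadoop fs -put file1.txt file2.txt input` (instead of uploading through the Eclipse DFS Locations view).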
Run WordCount
Running WordCount requires two command-line parameters: the first is the HDFS path of the folder to count, the other is the output path.
Note that the input path here is the parent directory of the uploaded files. When filling it in, you can double-click a file in the DFS Locations view to see its HDFS path; we want its directory, which is hdfs://192.168.88.128:9000/user/root/input. For the output parameter I wrote hdfs://192.168.88.128:9000/user/root/output.
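The same two parameters can also be passed to the example jar on the master node's command line instead of Eclipse's run configuration. This is a sketch: the jar name (hadoop-0.20.2-examples.jar) and the HDFS URIs match this post's setup but are assumptions you should adjust for your own cluster:

```shell
# The two program arguments, exactly as entered in Eclipse
# (URIs are this post's example cluster, an assumption for yours):
INPUT=hdfs://192.168.88.128:9000/user/root/input
OUTPUT=hdfs://192.168.88.128:9000/user/root/output
echo "program arguments: $INPUT $OUTPUT"
# Equivalent cluster-side invocation (run on the master, where the
# hadoop command is available):
#   hadoop jar hadoop-0.20.2-examples.jar wordcount "$INPUT" "$OUTPUT"
```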
After the job runs, refresh DFS Locations and you will see the output directory alongside the input directory.
Execute the following command on the master machine:
hadoop fs -lsr /
You will also see the new output directory, with files under it; these files contain the word-count results.
It's getting late, so I'll stop here. Tomorrow I'll upload the relevant plugin, along with a few Hadoop-related PDF documents.
Copyright notice: This is an original article by the blogger and may not be reproduced without the blogger's permission.
Learning Hadoop with me step by step (2)----Installing the Hadoop Eclipse plugin and running the WordCount program