Hadoop Environment Setup with the Eclipse Tool:
Set up a 64-bit Eclipse development environment on Windows 10 and configure the Hadoop plugin for Eclipse so that Eclipse can browse the contents of files in HDFS.
1. Copy the hadoop-eclipse-plugin-2.5.2.jar file into the plugins folder of your Eclipse installation.
2. Restart Eclipse. After Eclipse opens, locate and open the Map/Reduce perspective.
3. Switch to the Map/Reduce Locations view at the bottom of the workbench and configure the connection information.
Note:
A. The location name can be any name you like.
B. Host is the IP address of the host you want to connect to.
C. The first port is 50020 by default; the second port is the port number you configured yourself in the core-site.xml file.
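For reference, the second port comes from the default file system URI configured in core-site.xml (the property is fs.defaultFS in current releases, fs.default.name in older ones). The hostname 192.168.1.100 and port 9000 below are placeholders; use the values from your own cluster:

```xml
<configuration>
  <property>
    <!-- Older Hadoop releases use fs.default.name instead -->
    <name>fs.defaultFS</name>
    <value>hdfs://192.168.1.100:9000</value>
  </property>
</configuration>
```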
You can then browse the file information in HDFS as follows:
4. Upload and download a file to verify that the installation succeeded.
Note: uploaded files must be in UTF-8 format. Windows saves text files in ANSI format by default, so keep this in mind.
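One way to deal with the encoding issue is to re-encode the file to UTF-8 before uploading. The sketch below is an assumption, not part of the original tutorial; note that "ANSI" on a Chinese-locale Windows machine usually means GBK, while on a Western-locale machine it means windows-1252, so pass the charset that matches your system:

```java
import java.io.IOException;
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class ToUtf8 {
    // Decode the source file with the given legacy charset,
    // then write the same text back out as UTF-8.
    public static void convert(Path src, Path dst, String srcCharset) throws IOException {
        byte[] raw = Files.readAllBytes(src);
        String text = new String(raw, Charset.forName(srcCharset));
        Files.write(dst, text.getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) throws IOException {
        // Usage: java ToUtf8 <input> <output> [sourceCharset]
        // The default charset here is an assumption; use "GBK" on Chinese Windows.
        String charset = args.length > 2 ? args[2] : "windows-1252";
        convert(Paths.get(args[0]), Paths.get(args[1]), charset);
    }
}
```

After converting, the file can be uploaded to HDFS through the Eclipse view without the garbling described above.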
When uploading a file to HDFS, you may run into the following error: Permission denied: user=, access=WRITE, inode="". Creating a folder also fails. As the message suggests, this is a permissions issue.
The workaround is to add the following property (remember to refresh the view manually after creating a file or folder; the tree does not refresh automatically after the change):

<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>
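This property belongs inside the <configuration> element of hdfs-site.xml on the NameNode, and HDFS must be restarted for it to take effect. Disabling permission checking is only advisable on a development cluster:

```xml
<configuration>
  <property>
    <!-- Disable HDFS permission checking (development use only) -->
    <name>dfs.permissions</name>
    <value>false</value>
  </property>
</configuration>
```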
If the uploaded file is in UTF-8 format but appears garbled when opened in Eclipse, make the following change (typically, set the workspace text file encoding to UTF-8 under Window > Preferences > General > Workspace):