Experiment 2-1: Installing the Hadoop Plugin for Eclipse on Windows 7


The tutorials used are:

1. windows7+eclipse+hadoop2.5.2 Environment Configuration

http://www.cnblogs.com/huligong1234/p/4137133.html

2. Using Eclipse to build a Hadoop development environment on the Windows 7 operating system

http://www.linuxidc.com/Linux/2014-12/111061.htm

3. eclipse_win7_hadoop1.2.1 Development Environment Construction 1

http://blog.csdn.net/majian_1987/article/details/23941603

1. Download and install the latest Eclipse IDE for Java EE Developers (the Java environment configuration is not repeated here).

http://www.eclipse.org/downloads/download.php?file=/technology/epp/downloads/release/luna/SR2/eclipse-jee-luna-SR2-win32-x86_64.zip

2. Download the previously compiled 64-bit hadoop-2.5.2.tar.gz, and download hadoop-2.5.2-src.tar.gz from the official website; extract both to E:\hadoop. Also extract the hadoop2.5.2 (x64) .zip archive into the same directory, for example:

3. Add the environment variable HADOOP_HOME=E:\hadoop\hadoop-2.5.2\

Append to the PATH environment variable: %HADOOP_HOME%\bin

The variables above are added to the system variables rather than the user variables. -------- (cause unknown)

(The Paoding environment variables configured later are likewise set in the system hosting Hadoop rather than in the virtual machine system ------ the reason is still unclear)
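The settings in step 3 can also be made from an elevated Command Prompt on the Windows machine; a sketch, assuming the extract path from step 2:

```shell
:: Sketch of step 3 from an elevated Command Prompt (run as Administrator).
:: Paths assume the extract location E:\hadoop\hadoop-2.5.2 from step 2;
:: adjust to your layout. /M writes machine-level (system) variables,
:: matching the note above about system rather than user variables.
setx /M HADOOP_HOME "E:\hadoop\hadoop-2.5.2"
:: Caution: setx truncates values longer than 1024 characters, so verify
:: PATH afterwards; appending via the System Properties dialog is safer.
setx /M PATH "%PATH%;%HADOOP_HOME%\bin"
```

Open a new Command Prompt afterwards, since existing windows do not pick up the changed variables.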

4. Download the plugin hadoop-eclipse-plugin-2.5.2.jar and copy it into the "plugins" directory of the Eclipse installation, then restart Eclipse for it to take effect.

5. Configure the Hadoop installation directory. Open the menu Window-->Preferences-->Hadoop Map/Reduce and set the Hadoop installation path on the right side of the window. As shown in the following:

6. Configure Map/Reduce Locations

Open Window-->Open Perspective-->Other

Select Map/Reduce and click OK; at the bottom right you will see a Map/Reduce Locations tab, as shown in:

7. Click the Map/Reduce Locations tab, then click the icon on the right to open the Hadoop Location configuration window. Enter any name for Location name, then configure Map/Reduce Master and DFS Master; Host and Port must match the settings in core-site.xml.
The relevant core-site.xml configuration:

<property>
  <name>fs.default.name</name>
  <value>hdfs://master:9000</value>
</property>

The interface is configured as follows:

Then click "Advanced parameters" and find "hadoop.tmp.dir"; change it to the value set in our Hadoop cluster, which is "/home/hadoop/tmp", as configured in core-site.xml. ----------------- Others appear to have succeeded without this step; the Advanced parameters configuration needs further study.

The user and user-group settings described in the book cannot be found here; instead, modify hdfs-site.xml on the master node. See below for the specific method.

8. Browse the HDFS file system and try to create folders and upload files. Click "hadoop2.5.2" under "DFS Locations" on the left side of Eclipse to show the file structure on HDFS.

An error occurred. The solution is as follows:

A. Modify hdfs-site.xml on the master node, adding the following:

<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>

This is intended to remove permission checks, because when configuring the Map/Reduce connection from Eclipse on my Windows machine to the Hadoop server, the following error was reported: org.apache.hadoop.security.AccessControlException: Permission denied

With this setting you can operate on HDFS from the Eclipse side without having to package the job and upload it to the Hadoop cluster.

B. Also modify hdfs-site.xml on the master node, adding the following:

<property>
  <name>dfs.web.ugi</name>
  <value>Administrator,supergroup</value>
</property>

The reason is that the following error is reported at run time: WARN org.apache.hadoop.security.ShellBasedUnixGroupsMapping: got exception trying to get groups for user jack

Presumably because my Windows user name is Jack, which has no access rights.
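For reference, the two additions from A and B together sit inside the <configuration> root of hdfs-site.xml on the master node; a sketch of the resulting section (alongside whatever properties the file already contains):

```xml
<!-- hdfs-site.xml on the master node: the additions from steps A and B.
     Disabling dfs.permissions is only advisable in a test environment. -->
<configuration>
  <property>
    <name>dfs.permissions</name>
    <value>false</value>
  </property>
  <property>
    <name>dfs.web.ugi</name>
    <value>Administrator,supergroup</value>
  </property>
</configuration>
```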

More permission configurations can be found in the official documentation:

HDFS Permissions Guide http://hadoop.apache.org/docs/r1.0.4/cn/hdfs_permissions_guide.html

Restart the Hadoop cluster after modifying the configuration (it seems better to stop the cluster first, then modify the configuration and restart):

sbin/stop-dfs.sh

sbin/stop-yarn.sh

sbin/start-dfs.sh

sbin/start-yarn.sh

After this, the problem is resolved and no more errors occur, as follows:

Right-click to create a folder "xiapi", then right-click Refresh to see the folder just created. You can also right-click to upload a file, refresh to display it, and check it on master; this means the configuration is complete.
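The same checks can be made from a shell on the master node with the HDFS command-line client; a sketch (folder and file names are examples only):

```shell
# Sketch: verify from the master node what Eclipse did remotely.
# Assumes the cluster from core-site.xml (fs.default.name = hdfs://master:9000)
# is running; "xiapi" and "localfile.txt" are example names.
hdfs dfs -ls /                      # list the HDFS root
hdfs dfs -mkdir /xiapi              # same effect as the right-click "create folder"
hdfs dfs -put localfile.txt /xiapi  # same effect as the right-click upload
hdfs dfs -ls /xiapi                 # confirm the upload is visible
```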

