Windows 7 + Eclipse + Hadoop 2.5.2 Environment Configuration

Source: Internet
Author: User
Tags: hadoop, fs

I. Hadoop cluster environment configuration: see my earlier post (Ubuntu + Hadoop 2.5.2 distributed environment configuration, http://www.cnblogs.com/huligong1234/p/4136331.html).
However, I also made the following changes while configuring (your environment may differ from mine, so you can make these changes later once you hit the corresponding problems):

A. On the master node (ubuntu-V01), edit hdfs-site.xml and add:

    <property>
        <name>dfs.permissions</name>
        <value>false</value>
    </property>

This disables permission checks; without it, connecting to the Hadoop server from Eclipse on my Windows machine failed with:

    org.apache.hadoop.security.AccessControlException: Permission denied
B. Also on the master node (ubuntu-V01), edit hdfs-site.xml and add:

    <property>
        <name>dfs.web.ugi</name>
        <value>jack,supergroup</value>
    </property>

The reason is that at run time the following warning is reported:

    WARN org.apache.hadoop.security.ShellBasedUnixGroupsMapping: got exception trying to get groups for user jack

jack is my Windows username, which has no access permissions on the cluster. For more on permissions configuration, see the official documentation: HDFS Permissions Guide, http://hadoop.apache.org/docs/r1.0.4/cn/hdfs_permissions_guide.html
Restart the Hadoop cluster after modifying the configuration:

    hadoop@ubuntu-V01:~/data$ ./sbin/stop-dfs.sh
    hadoop@ubuntu-V01:~/data$ ./sbin/stop-yarn.sh
    hadoop@ubuntu-V01:~/data$ ./sbin/start-dfs.sh
    hadoop@ubuntu-V01:~/data$ ./sbin/start-yarn.sh
II. Windows basic environment preparation: Windows 7 (x64), JDK, Ant, Eclipse, Hadoop
1. JDK: install jdk-6u26-windows-i586.exe, then set the JAVA_HOME environment variable and add its bin directory to PATH.
2. Eclipse: extract eclipse-standard-luna-SR1-win32.zip to the D:\eclipse\ directory and rename the folder eclipse-hadoop. Download: http://developer.eclipsesource.com/technology/epp/luna/eclipse-standard-luna-SR1-win32.zip

3. Ant: extract apache-ant-1.9.4-bin.zip to the D:\apache\ directory, set the ANT_HOME environment variable, and add its bin directory to PATH. Download: http://mirror.bit.edu.cn/apache//ant/binaries/apache-ant-1.9.4-bin.zip
4. Download hadoop-2.5.2.tar.gz: http://mirror.bit.edu.cn/apache/hadoop/common/hadoop-2.5.2/hadoop-2.5.2.tar.gz

5. Download hadoop-2.5.2-src.tar.gz: http://mirror.bit.edu.cn/apache/hadoop/common/hadoop-2.5.2/hadoop-2.5.2-src.tar.gz

6. Download hadoop2x-eclipse-plugin: https://github.com/winghc/hadoop2x-eclipse-plugin

7. Download hadoop-common-2.2.0-bin: https://github.com/srccodes/hadoop-common-2.2.0-bin
Extract the downloaded hadoop-2.5.2.tar.gz, hadoop-2.5.2-src.tar.gz, hadoop2x-eclipse-plugin, and hadoop-common-2.2.0-bin into the F:\hadoop\ directory.
8. Edit the local hosts file and add the following line: 192.168.1.112 ubuntu-V01
III. Compile hadoop-eclipse-plugin-2.5.2.jar

1. Add the environment variable HADOOP_HOME=F:\hadoop\hadoop-2.5.2\ and append %HADOOP_HOME%\bin to PATH.
2. Update the build and dependency version information: edit F:\hadoop\hadoop2x-eclipse-plugin-master\ivy\libraries.properties and set hadoop.version=2.5.2 and jackson.version=1.9.13.
3. Compile with Ant:

    F:\hadoop\hadoop2x-eclipse-plugin-master\src\contrib\eclipse-plugin> ant jar -Dversion=2.5.2 -Declipse.home=D:\eclipse\eclipse-hadoop\eclipse -Dhadoop.home=F:\hadoop\hadoop-2.5.2
After compilation, hadoop-eclipse-plugin-2.5.2.jar will be in the F:\hadoop\hadoop2x-eclipse-plugin-master\build\contrib\eclipse-plugin directory.
IV. Eclipse environment configuration

1. Copy the compiled hadoop-eclipse-plugin-2.5.2.jar into Eclipse's plugins directory, then restart Eclipse.

2. Open the menu Window > Preferences > Hadoop Map/Reduce and configure it as shown.

3. Open the Hadoop connection configuration view: Window > Show View > Other > MapReduce Tools, as shown.

4. Configure the connection to Hadoop as shown.

If the connection succeeds, you will be able to browse the cluster and see information like the following.
V. Add test files to the Hadoop cluster (skip this step if files are already available)
A. Create an input directory on DFS:

    hadoop@ubuntu-V01:~/data/hadoop-2.5.2$ bin/hadoop fs -mkdir -p input

B. Copy README.txt from the Hadoop directory into the new DFS input directory:

    hadoop@ubuntu-V01:~/data/hadoop-2.5.2$ bin/hadoop fs -copyFromLocal README.txt input
VI. Create a Map/Reduce project

1. New project: File > New > Other > Map/Reduce Project, named MR1. Then create the class org.apache.hadoop.examples.WordCount and copy its contents from the hadoop-2.5.2-src sources (F:\hadoop\hadoop-2.5.2-src\hadoop-mapreduce-project\hadoop-mapreduce-examples\src\main\java\org\apache\hadoop\examples\WordCount.java).
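Before wiring the job up to the cluster, it helps to see what the WordCount example actually computes. The sketch below is not the Hadoop class itself; it is the same tokenize-and-count logic with the map phase (emit each token) and reduce phase (sum per token) collapsed into one in-memory pass. The class and method names here are my own:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.StringTokenizer;
import java.util.TreeMap;

public class WordCountSketch {
    // Mapper equivalent: tokenize each line and emit (word, 1);
    // Reducer equivalent: sum the counts per word. Both phases are
    // folded into a single pass over the input lines.
    static Map<String, Integer> count(List<String> lines) {
        Map<String, Integer> counts = new TreeMap<>();
        for (String line : lines) {
            StringTokenizer itr = new StringTokenizer(line);
            while (itr.hasMoreTokens()) {
                counts.merge(itr.nextToken(), 1, Integer::sum);
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        Map<String, Integer> c = count(Arrays.asList("hello hadoop", "hello eclipse"));
        System.out.println(c); // {eclipse=1, hadoop=1, hello=2}
    }
}
```

Running the real job over README.txt produces the same kind of word-to-count output, written to the HDFS output directory instead of the console.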
2. Create a log4j.properties file in the src directory with the following contents:

    log4j.rootLogger=debug,stdout,R
    log4j.appender.stdout=org.apache.log4j.ConsoleAppender
    log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
    log4j.appender.stdout.layout.ConversionPattern=%5p - %m%n
    log4j.appender.R=org.apache.log4j.RollingFileAppender
    log4j.appender.R.File=mapreduce_test.log
    log4j.appender.R.MaxFileSize=1MB
    log4j.appender.R.MaxBackupIndex=1
    log4j.appender.R.layout=org.apache.log4j.PatternLayout
    log4j.appender.R.layout.ConversionPattern=%p %t %c - %m%n
    log4j.logger.com.codefutures=DEBUG
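In the stdout pattern, %5p right-aligns the level name to a width of 5 and %m%n is the message plus a newline. A plain-Java illustration of that layout (this is not log4j; the render helper and the "level - message" separator here are my own, assuming the common `%5p - %m%n` form):

```java
public class PatternDemo {
    // Mimics log4j's "%5p - %m%n": level right-aligned to width 5,
    // a literal " - ", then the message and a line separator.
    static String render(String level, String message) {
        return String.format("%5s - %s%n", level, message);
    }

    public static void main(String[] args) {
        System.out.print(render("INFO", "Job started"));    // " INFO - Job started"
        System.out.print(render("DEBUG", "Map task done")); // "DEBUG - Map task done"
    }
}
```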
3. Fix the exception java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I) (your environment may differ from mine; make this change later once you hit the problem). Copy the source file org.apache.hadoop.io.nativeio.NativeIO into the project, locate line 570, and change the method to simply return true, as shown.
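For reference, the shape of that change is roughly the following. This is a simplified stand-in, not the real NativeIO class (which contains many native members); it only illustrates short-circuiting the Windows access check so the native access0 call is never reached:

```java
// Simplified stand-in for org.apache.hadoop.io.nativeio.NativeIO$Windows.
// In the copied NativeIO.java, the access(...) method around line 570
// delegates to the native access0(...); the workaround replaces that
// delegation with an unconditional "return true".
public class NativeIOPatch {
    // Patched version: always report the path as accessible on Windows.
    public static boolean access(String path, int desiredAccess) {
        return true; // was (roughly): return access0(path, desiredAccess);
    }

    public static void main(String[] args) {
        System.out.println(access("F:\\hadoop\\tmp", 1)); // true
    }
}
```

Note this disables a safety check and is only appropriate for local development against a test cluster.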
VII. Windows runtime environment configuration (if it does not take effect, restart the machine)

hadoop.dll and winutils.exe are required. I simply copied the contents of the F:\hadoop\hadoop-common-2.2.0-bin-master\bin directory over F:\hadoop\hadoop-2.5.2\bin.
VIII. Run the project in Eclipse: select WordCount.java, right-click Run As > Run Configurations, and configure the run arguments, i.e. the input and output folders:

    hdfs://ubuntu-v01:9000/user/hadoop/input hdfs://ubuntu-v01:9000/user/hadoop/output

Note: if the output directory already exists, delete it or use a different name, such as output01, output02, ...
