How to install Hadoop under Cygwin64 on Windows 7

Source: Internet
Author: User

First we need to prepare the following environment and software:

Cygwin 1.7.9-1
jdk-6u25-windows-x64.zip
hadoop-0.20.2.tar.gz

1. Install the JDK on the Windows 7 system, and make sure the Java environment variables are set up:

The main variables are: JAVA_HOME, PATH, CLASSPATH

(If these are not set, please look up how to configure them on your own.)
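For reference, typical values look like the following (the JDK path is an example; adjust it to your actual install location). On Windows these are set under System Properties, Environment Variables:

```
JAVA_HOME = D:\java\jdk1.7.0_15
PATH      = ...;%JAVA_HOME%\bin
CLASSPATH = .;%JAVA_HOME%\lib\dt.jar;%JAVA_HOME%\lib\tools.jar
```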

2. Next is the installation of Hadoop itself. I am installing version 0.20.2. For convenience, I put it directly into Cygwin64's /home directory for now (under normal circumstances, put it in the /usr directory), and extract it with the tar command:

$ tar -zxvf hadoop-0.20.2.tar.gz
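The same `tar -zxvf` invocation can be tried safely on a throwaway archive first. A minimal sketch (all paths below are scratch locations created with `mktemp -d`, not real Hadoop files):

```shell
# Build a tiny stand-in archive, then extract it with the same flags
# (-z gunzip, -x extract, -v verbose, -f archive file).
tmp=$(mktemp -d)
mkdir -p "$tmp/hadoop-0.20.2/bin"
echo '#!/bin/sh' > "$tmp/hadoop-0.20.2/bin/hadoop"
tar -C "$tmp" -czf "$tmp/hadoop-0.20.2.tar.gz" hadoop-0.20.2

# Extract into a stand-in "home" directory, as the article does in /home.
mkdir -p "$tmp/home"
tar -C "$tmp/home" -zxvf "$tmp/hadoop-0.20.2.tar.gz"
ls "$tmp/home/hadoop-0.20.2/bin"
```

The `-C` flag changes directory before extracting, which avoids having to `cd` first.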

3. Simply installing Hadoop is not enough; some simple configuration work is also needed. There are four main configuration files, located in the conf subdirectory of the Hadoop installation directory:

hadoop-env.sh
core-site.xml
hdfs-site.xml
mapred-site.xml

Here is how to modify each of them in detail:

(1) Modify the hadoop-env.sh file:

This step is simple: just set JAVA_HOME to the JDK's installation directory:

The export JAVA_HOME line is the modified part.

# Set Hadoop-specific environment variables here.

# The only required environment variable is JAVA_HOME.  All others are
# optional.  When running a distributed configuration it is best to
# set JAVA_HOME in this file, so that it is correctly defined on
# remote nodes.

# The java implementation to use.  Required.
export JAVA_HOME=/cygdrive/d/android/java/jdk1.7.0_15

# Extra Java CLASSPATH elements.  Optional.
# export HADOOP_CLASSPATH=

# The maximum amount of heap to use, in MB. Default is 1000.
# export HADOOP_HEAPSIZE=2000

# Extra Java runtime options.  Empty by default.
# export HADOOP_OPTS=-server

# Command specific options appended to HADOOP_OPTS when specified
export HADOOP_NAMENODE_OPTS="-Dcom.sun.management.jmxremote $HADOOP_NAMENODE_OPTS"
export HADOOP_SECONDARYNAMENODE_OPTS="-Dcom.sun.management.jmxremote $HADOOP_SECONDARYNAMENODE_OPTS"
export HADOOP_DATANODE_OPTS="-Dcom.sun.management.jmxremote $HADOOP_DATANODE_OPTS"
export HADOOP_BALANCER_OPTS="-Dcom.sun.management.jmxremote $HADOOP_BALANCER_OPTS"
export HADOOP_JOBTRACKER_OPTS="-Dcom.sun.management.jmxremote $HADOOP_JOBTRACKER_OPTS"
# export HADOOP_TASKTRACKER_OPTS=
# The following applies to multiple commands (fs, dfs, fsck, distcp etc)
# export HADOOP_CLIENT_OPTS

# Extra ssh options.  Empty by default.
# export HADOOP_SSH_OPTS="-o ConnectTimeout=1 -o SendEnv=HADOOP_CONF_DIR"

# Where log files are stored.  $HADOOP_HOME/logs by default.
# export HADOOP_LOG_DIR=${HADOOP_HOME}/logs

# File naming remote slave hosts.  $HADOOP_HOME/conf/slaves by default.
# export HADOOP_SLAVES=${HADOOP_HOME}/conf/slaves

# host:path where hadoop code should be rsync'd from.  Unset by default.
# export HADOOP_MASTER=master:/home/$USER/src/hadoop

# Seconds to sleep between slave commands.  Unset by default.  This
# can be useful in large clusters, where, e.g., slave rsyncs can
# otherwise arrive faster than the master can service them.
# export HADOOP_SLAVE_SLEEP=0.1

# The directory where pid files are stored. /tmp by default.
# export HADOOP_PID_DIR=/var/hadoop/pids

# A string representing this instance of hadoop. $USER by default.
# export HADOOP_IDENT_STRING=$USER

# The scheduling priority for daemon processes.  See 'man nice'.
# export HADOOP_NICENESS=10

(Note: the path here cannot be a Windows-style directory such as D:\java\jdk1.7.0_15; it must be the Linux-style /cygdrive/d/java/jdk1.7.0_15.)
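If you are unsure of the conversion, Cygwin ships a `cygpath` utility (`cygpath -u 'D:\java\jdk1.7.0_15'`) that prints the /cygdrive form. As a rough sketch of what it does for simple drive-letter paths (the helper name `win_to_cyg` is ours, not a real tool):

```shell
# Rough equivalent of `cygpath -u` for simple `X:\...` paths:
# lowercase the drive letter, drop the colon, turn backslashes into slashes.
win_to_cyg() {
  drive=$(printf '%s' "$1" | cut -c1 | tr '[:upper:]' '[:lower:]')
  rest=$(printf '%s' "$1" | cut -c3- | tr '\\' '/')
  printf '/cygdrive/%s%s\n' "$drive" "$rest"
}

win_to_cyg 'D:\java\jdk1.7.0_15'   # prints /cygdrive/d/java/jdk1.7.0_15
```

In practice, prefer the real `cygpath`, which also handles UNC paths and spaces.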

(2) Modify core-site.xml:

The <property> block is the newly added code.

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

(3) Modify hdfs-site.xml (set the replication factor to 1):

The <property> block is the newly added code.

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

(4) Modify mapred-site.xml (specify the JobTracker address):

The <property> block is the newly added code.

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
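A typo in any of these files will make the daemons fail at startup, so it is worth double-checking that each property reads back correctly. A rough sed sketch (the `site_value` helper is ours, not part of Hadoop, and it assumes the name/value-on-adjacent-lines layout shown above rather than being a general XML parser):

```shell
# Pull a property value back out of a *-site.xml file.
site_value() {  # usage: site_value FILE PROPERTY-NAME
  sed -n "/<name>$2<\/name>/{n;s/.*<value>\(.*\)<\/value>.*/\1/p;}" "$1"
}

# Demonstrate on a scratch copy of the core-site.xml content from above.
conf=$(mktemp)
cat > "$conf" <<'EOF'
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF

site_value "$conf" fs.default.name   # prints hdfs://localhost:9000
```

In practice you would point it at conf/core-site.xml and friends, e.g. `site_value conf/mapred-site.xml mapred.job.tracker`.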

4. Verify the installation and run Hadoop

(1) Verifying the installation

$ bin/hadoop
Usage: hadoop [--config confdir] COMMAND
where COMMAND is one of:
  namenode -format     format the DFS filesystem
  secondarynamenode    run the DFS secondary namenode
  namenode             run the DFS namenode
  datanode             run a DFS datanode
  dfsadmin             run a DFS admin client
  mradmin              run a Map-Reduce admin client
  fsck                 run a DFS filesystem checking utility
  fs                   run a generic filesystem user client
  balancer             run a cluster balancing utility
  jobtracker           run the MapReduce job Tracker node
  pipes                run a Pipes job
  tasktracker          run a MapReduce task Tracker node
  job                  manipulate MapReduce jobs
  queue                get information regarding JobQueues
  version              print the version
  jar <jar>            run a jar file
  distcp <srcurl> <desturl> copy file or directories recursively
  archive -archiveName NAME <src>* <dest> create a hadoop archive
  daemonlog            get/set the log level for each daemon
 or
  CLASSNAME            run the class named CLASSNAME
Most commands print help when invoked w/o parameters.

(2) Format and start Hadoop

$ bin/hadoop namenode –format
INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = lenovo-PC/192.168.41.1
STARTUP_MSG:   args = [–format]
STARTUP_MSG:   version = 0.20.2
STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20 -r 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
************************************************************/
Usage: java NameNode [-format] | [-upgrade] | [-rollback] | [-finalize] | [-importCheckpoint]
INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at lenovo-PC/192.168.41.1
************************************************************/

(Note: the Usage line and the immediate shutdown show that the format did not actually run here; the dash typed in front of "format" was not a plain ASCII hyphen. Rerun with an ordinary hyphen: bin/hadoop namenode -format.)
$ bin/start-all.sh
starting namenode, logging to /home/hadoop-0.20.2/bin/../logs/hadoop-lenovo-namenode-lenovo-PC.out
localhost: /home/hadoop-0.20.2/bin/slaves.sh: ssh: command not found
localhost: /home/hadoop-0.20.2/bin/slaves.sh: ssh: command not found
starting jobtracker, logging to /home/hadoop-0.20.2/bin/../logs/hadoop-lenovo-jobtracker-lenovo-PC.out
localhost: /home/hadoop-0.20.2/bin/slaves.sh: ssh: command not found

(The "ssh: command not found" messages most likely mean the Cygwin openssh package is not installed; the slave scripts use ssh to start the DataNode and TaskTracker.)

(3) View Hadoop

Command line view:

$ jps
6948 JobTracker
9008 Jps
6748 NameNode

(Note: the DataNode and TaskTracker processes do not show up under Cygwin on Win7; this appears to be a Cygwin issue.)

Now you can view the web interfaces:

http://localhost:50030 (the JobTracker web UI)

http://localhost:50070 (the NameNode web UI)

(4) Turn off Hadoop

$ bin/stop-all.sh

Copyright statement: parts of this article refer to material found online; if you have any questions, please get in touch. Thank you for your cooperation.

