Hadoop environment setup on Mac (single node)


I. Installing Java
1. Download and install the JDK. I downloaded 1.8.0_45 from http://www.oracle.com/technetwork/java/javase/downloads/index-jsp-138363.html. After installation, the default install path is /Library/Java/JavaVirtualMachines/jdk1.8.0_45.jdk/Contents/Home.
2. Test whether the installation succeeded. In the terminal, enter:
java -version
If the installation is successful, the appropriate Java version is displayed.
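As a rough sketch of what to expect (the exact build numbers on your machine will differ), the output of java -version looks something like this:

java version "1.8.0_45"
Java(TM) SE Runtime Environment (build 1.8.0_45-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.45-b02, mixed mode)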
II. Download and install Hadoop
Download URL: http://mirrors.cnnic.cn/apache/hadoop/common/ . I downloaded hadoop-1.2.1.tar.gz from the stable1 directory, which is the 1.x stable release, and unpacked it into a hadoop folder under my user directory, so the path is /Users/liu/hadoop/hadoop-1.2.1 (one way to do this is sketched below).
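A minimal sketch of the download and unpack steps, assuming the tarball sits under the stable1 directory of the mirror above and that you want the same /Users/liu/hadoop target directory (adjust both to your own setup):

mkdir -p /Users/liu/hadoop
cd /Users/liu/hadoop
curl -O http://mirrors.cnnic.cn/apache/hadoop/common/stable1/hadoop-1.2.1.tar.gz
tar -xzf hadoop-1.2.1.tar.gz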
III. Hadoop configuration
1. Configure hadoop-env.sh
In the conf directory under the Hadoop installation, locate hadoop-env.sh, open it, and edit the following settings:
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_45.jdk/Contents/Home (remove the leading comment #). Make sure the JAVA_HOME path is correct.
export HADOOP_HEAPSIZE=2000 (remove the leading comment #). Note: some blogs say you also need to uncomment the following line:
export HADOOP_OPTS="-Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk" (remove the comment #). I did not find this line in my file, so I left it out.
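For reference, a sketch of how the edited portion of conf/hadoop-env.sh might look after the changes above (the surrounding comments and line order in the stock file may differ; the JAVA_HOME path is the install path used in this post):

# The java implementation to use.
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_45.jdk/Contents/Home

# The maximum amount of heap to use, in MB.
export HADOOP_HEAPSIZE=2000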
2. Configure core-site.xml: specifies the hostname and port of the NameNode
<span style= "" ><?xml version= "1.0"? ><?xml-stylesheet type= "text/xsl" href= "configuration.xsl"?> <!--Put Site-specific property overrides in the this file. --><configuration><property><name>fs.default.name</name><value>localhost:9000 </value></property></configuration></span>
<span style= "" ><span style= "FONT-SIZE:14PX;" >3.<span style= "font-family:arial; line-height:26px; Color:rgb (51, 51, 51); " ><span style= "Color:rgb (255, 0, 0); > Configuration hdfs-site.xml--Specifies the number of default parameter replicas for HDFs because it runs on only one node, so the number of replicas here is 1</span></span></span></span >
<span style= "font-family:arial; line-height:26px; " ></span><pre name= "code" class= "HTML" ><span style= "" ><?xml version= "1.0"? ><? Xml-stylesheet type= "text/xsl" href= "configuration.xsl"?><!--Put site-specific property overrides in this file. --><configuration><property><name>dfs.replication</name><value>1</value ></property></configuration></span>
4. Configure mapred-site.xml: specifies the hostname and port of the JobTracker
<?xml version= "1.0"? ><?xml-stylesheet type= "text/xsl" href= "configuration.xsl"?><!--Put Site-specific property overrides the this file. --><configuration><property><name>mapred.job.tracker</name><value>localhost :9001</value></property></configuration>

5. SSH configuration. Turn on Sharing in System Preferences, enable Remote Login, and allow access for all users. Then passwordless SSH login can be set up as follows. In the terminal, enter:
ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
At this point in the terminal input:
ls ~/.ssh
You should see id_dsa and id_dsa.pub in the folder: an SSH private/public key pair. Next, append the public key to the authorized keys file. In the terminal, enter:

<span style= "FONT-SIZE:14PX;" >cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys</span>
You can now log in over SSH without entering a password.
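A quick check, assuming the default macOS SSH setup: the first connection may ask you to confirm the host key, but it should not ask for a password.

ssh localhost
exit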
6. Set environment variables

Before you start Hadoop, the following environment variables need to be set.

export HADOOP_HOME=/Users/liu/hadoop/hadoop-1.2.1 (set according to your own directory)

export PATH=$PATH:$HADOOP_HOME/bin
Note: export settings are only valid for the current bash session; they exist only in memory. To avoid re-entering them every time, write them into the profile file under /etc (or your own shell profile) as well, as sketched below.
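A minimal sketch of making the variables permanent, assuming a bash login shell and the paths used in this post; appending to ~/.bash_profile is the per-user alternative to editing /etc/profile mentioned above:

# Append the two exports to the current user's profile.
echo 'export HADOOP_HOME=/Users/liu/hadoop/hadoop-1.2.1' >> ~/.bash_profile
echo 'export PATH=$PATH:$HADOOP_HOME/bin' >> ~/.bash_profile
# Reload the profile in the current session.
source ~/.bash_profile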

IV. Testing

1. Format the NameNode

After completing the above setup, enter in the terminal:

$HADOOP_HOME/bin/hadoop namenode -format

If the output reports that the storage directory has been successfully formatted, the format succeeded. (Screenshot omitted.)

2. Start Hadoop

    $HADOOP_HOME/bin/start-all.sh
<span style= "font-family: ' Microsoft Yahei '; "> If the following conditions occur:</span>


Then Hadoop starts successfully.
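As an optional sanity check (not part of the original steps), the JDK's jps command lists the running Java processes; after a successful start you should see the five Hadoop daemons alongside Jps itself:

jps
# Expected entries (process IDs will differ):
#   NameNode
#   DataNode
#   SecondaryNameNode
#   JobTracker
#   TaskTracker
#   Jps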
