Hadoop Learning One: Hadoop Installation (Hadoop 2.4.1, Ubuntu 14.04)


1. Create a user

sudo adduser hduser

Grant the hduser user sudo rights:

sudo vim /etc/sudoers, and add the line hduser ALL=(ALL:ALL) ALL to the file.

  

2. Install SSH and set up passwordless login

1) sudo apt-get install openssh-server

2) Start the service: sudo /etc/init.d/ssh start

3) Check that the service started correctly: ps -e | grep ssh

  

4) Set up passwordless login by generating a private/public key pair:

ssh-keygen -t rsa -P ""

cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys

  

5) Log in without a password: ssh localhost

6) Exit: exit
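Steps 4) and 5) above can be sketched as a single script. This is a minimal sketch, not the article's exact procedure: it works in a scratch directory ($DEMO, a name introduced here) so it can be run safely; point it at ~/.ssh for real use, and note that sshd requires the strict file modes set below.

```shell
#!/usr/bin/env bash
# Sketch of the passwordless-login setup (assumes OpenSSH is installed).
# DEMO is a scratch directory used here for safety; use ~/.ssh for real.
set -e
DEMO=$(mktemp -d)

ssh-keygen -t rsa -P "" -f "$DEMO/id_rsa" -q       # empty passphrase
cat "$DEMO/id_rsa.pub" >> "$DEMO/authorized_keys"  # authorize our own key
chmod 700 "$DEMO"                                  # sshd rejects loose modes
chmod 600 "$DEMO/authorized_keys"

echo "generated: $(ls "$DEMO" | tr '\n' ' ')"
```

After running the real version against ~/.ssh, `ssh localhost` should no longer prompt for a password.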

3. Configuring the Java Environment

1) Download: jdk-8u25-linux-x64.tar.gz

2) Decompress: tar -xzvf jdk-8u25-linux-x64.tar.gz

3) sudo mv jdk1.8.0_25 /usr/local/

4) Set the environment variables: sudo vim /etc/profile (the global settings file; you can also use ~/.bashrc, which applies only to the current user), and add at the end:

export JAVA_HOME=/usr/local/jdk1.8.0_25
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export PATH=${JAVA_HOME}/bin:$PATH

  

5) source /etc/profile

6) Test that it succeeded: java -version
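The exports above can be sanity-checked with a short script. This only verifies that the variables expand as intended (it does not run Java itself); /usr/local/jdk1.8.0_25 is the path assumed throughout this article.

```shell
#!/usr/bin/env bash
# Verify that the /etc/profile additions expand as intended.
# /usr/local/jdk1.8.0_25 is the JDK path assumed by this article.
export JAVA_HOME=/usr/local/jdk1.8.0_25
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export PATH=${JAVA_HOME}/bin:$PATH

# PATH should now contain the JDK's bin directory.
case ":$PATH:" in
  *":${JAVA_HOME}/bin:"*) echo "PATH OK" ;;
  *)                      echo "PATH missing ${JAVA_HOME}/bin" ;;
esac
echo "CLASSPATH=$CLASSPATH"
```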

  

4. Hadoop stand-alone installation

1) Download

2) Unzip: sudo tar -xzvf hadoop-2.4.1.tar.gz

3) sudo mv hadoop-2.4.1 /usr/local/

4) sudo chmod 774 /usr/local/hadoop-2.4.1

5) vim ~/.bashrc, and add at the end of the file:

export JAVA_HOME=/usr/local/jdk1.8.0_25           # your own Java installation path
export HADOOP_INSTALL=/usr/local/hadoop-2.4.1     # your own Hadoop installation path
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"

  

6) source ~/.bashrc
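As with the Java variables, the ~/.bashrc additions can be sanity-checked before moving on. A minimal sketch, assuming the article's install path of /usr/local/hadoop-2.4.1:

```shell
#!/usr/bin/env bash
# Verify the ~/.bashrc Hadoop additions expand as intended.
# /usr/local/hadoop-2.4.1 is the install path assumed by this article.
export HADOOP_INSTALL=/usr/local/hadoop-2.4.1
export PATH=$PATH:$HADOOP_INSTALL/bin:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"

# Each *_HOME variable should point at the install directory.
for v in HADOOP_MAPRED_HOME HADOOP_COMMON_HOME HADOOP_HDFS_HOME YARN_HOME; do
  eval val=\$$v
  [ "$val" = "$HADOOP_INSTALL" ] && echo "$v OK"
done
```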

7) Enter the /usr/local/hadoop-2.4.1/etc/hadoop directory and configure hadoop-env.sh:

vim hadoop-env.sh, and fill in your own Java path (JAVA_HOME) and Hadoop configuration path. (For stand-alone mode, leaving the Hadoop configuration path unmodified has no effect; for pseudo-distributed mode it must be modified.)

  

8) source hadoop-env.sh. The stand-alone mode configuration is now complete.

5. Pseudo-distributed configuration (in /usr/local/hadoop-2.4.1/etc/hadoop)

1) Configure core-site.xml. First create a tmp folder under /usr/local/hadoop-2.4.1: mkdir tmp. Then vim core-site.xml and add:

<configuration>
<property>
<name>hadoop.tmp.dir</name>
<value>file:/usr/local/hadoop-2.4.1/tmp</value>
<description>A base for other temporary directories.</description>
</property>
<property>
<name>fs.defaultFS</name>
<value>hdfs://localhost:9000</value>
</property>
</configuration>

  

2) Configure hdfs-site.xml. First create folders under /usr/local/hadoop-2.4.1: mkdir hdfs, mkdir hdfs/name, mkdir hdfs/data. Then vim hdfs-site.xml and add:

<configuration>

<property>
<name>dfs.replication</name>
<value>1</value>
</property>

<property>
<name>dfs.namenode.name.dir</name>
<value>file:/usr/local/hadoop-2.4.1/hdfs/name</value>
</property>

<property>
<name>dfs.datanode.data.dir</name>
<value>file:/usr/local/hadoop-2.4.1/hdfs/data</value>
</property>

</configuration>
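The directories referenced by hadoop.tmp.dir, dfs.namenode.name.dir, and dfs.datanode.data.dir must exist before formatting. The mkdir steps above can be sketched as one script; HADOOP_HOME is a variable introduced here that defaults to a scratch directory so the script is safe to try — set it to /usr/local/hadoop-2.4.1 for real use.

```shell
#!/usr/bin/env bash
# Create the tmp, hdfs/name, and hdfs/data directories the XML refers to.
# HADOOP_HOME defaults to a scratch directory for a safe dry run;
# set HADOOP_HOME=/usr/local/hadoop-2.4.1 for the real install.
set -e
HADOOP_HOME=${HADOOP_HOME:-$(mktemp -d)}
mkdir -p "$HADOOP_HOME/tmp" "$HADOOP_HOME/hdfs/name" "$HADOOP_HOME/hdfs/data"
ls "$HADOOP_HOME/hdfs"
```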

  

3) Configure yarn-site.xml: vim yarn-site.xml, and add:

<configuration>
<property>
<name>yarn.nodemanager.aux-services</name>
<value>mapreduce_shuffle</value>
</property>

<property>
<name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
<value>org.apache.hadoop.mapred.ShuffleHandler</value>
</property>
</configuration>
  

4) Configure mapred-site.xml: cp mapred-site.xml.template mapred-site.xml, then vim mapred-site.xml and add:

<configuration>
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>
</configuration>

  

5) Format HDFS: hdfs namenode -format

6) Execute the start commands: sbin/start-dfs.sh and sbin/start-yarn.sh

7) Run jps to view the Java-related processes (you should see NameNode, DataNode, SecondaryNameNode, ResourceManager, and NodeManager):

  

8) HDFS (NameNode) management interface: http://localhost:50070/

  

9) YARN (ResourceManager) process management interface: http://localhost:8088

  
