Hadoop Installation on CentOS

Hadoop installation is not difficult, but it requires a fair amount of preparation.

1. Install the JDK first. On CentOS it can be installed directly with yum install java-1.6.0-openjdk; the installation method may differ between distributions.

2. Set up SSH key-based login authentication. Without this step, you will be prompted for a password every time Hadoop runs. Run ssh-keygen -t rsa to generate an SSH key pair, then go to the .ssh directory under the current user's home directory and run cp /home/hadoop/.ssh/id_rsa.pub /home/hadoop/.ssh/authorized_keys. The goal of this step is to let SSH log in to the local machine (that is, localhost) through key authentication. If you want to build a Hadoop cluster, you also need to exchange public keys between the nodes with commands such as scp /home/hadoop/.ssh/authorized_keys hadoop@<node>:/home/hadoop/.ssh/, so that key authentication works between nodes as well.

3. Next comes the Hadoop installation itself. This step is perhaps the simplest: download the installation package from the official Hadoop website and unpack it, and the installation is done, much like portable ("green") software on Windows. After unpacking, you will find the Hadoop executables and configuration files in the directory, and you can run the executable directly. It will most likely report an error at first; this is normal, so don't panic. First edit the hadoop-env.sh file, which can be found in the etc folder under the Hadoop directory. Its location may differ between Hadoop versions; you can locate it with find <hadoop install dir> -name "hadoop-env.sh". Find the JAVA_HOME variable in it and set its value to your JDK installation directory (the directory containing bin/java); if you do not know where that is, use find or which java to locate it. Then you can run Hadoop again.
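The SSH commands in step 2 were garbled in translation; below is a cleaned-up sketch, assuming the current user is named hadoop and Hadoop is unpacked under /usr/local/hadoop (both taken from the surrounding text, not guaranteed for your system):

```shell
# Key-based SSH setup for Hadoop (assumes the current user is "hadoop").
mkdir -p ~/.ssh && chmod 700 ~/.ssh
# Generate an RSA key pair with no passphrase, unless one already exists:
[ -f ~/.ssh/id_rsa ] || ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
# Authorize the public key for password-less login to localhost:
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
# For a cluster, copy authorized_keys to every node as well, e.g.:
#   scp ~/.ssh/authorized_keys hadoop@<node>:~/.ssh/
# Locate hadoop-env.sh (its path varies between Hadoop versions):
#   find /usr/local/hadoop -name "hadoop-env.sh"
```

After this, ssh localhost should log in without prompting for a password.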
You can also add the Hadoop home directory to PATH, so that you can run hadoop commands from any directory. Modify the /etc/profile file and append the Hadoop installation directory to the end of the file:

export HADOOP_HOME=/usr/local/hadoop/hadoop-0.21.0
export PATH=$HADOOP_HOME/bin:$PATH
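The effect of these lines can be checked in the current shell before logging out; a minimal sketch, using the example install path from above (adjust it to your actual directory):

```shell
# Apply the same settings to the current shell session
# (example path from the article; adjust to your actual install directory):
export HADOOP_HOME=/usr/local/hadoop/hadoop-0.21.0
export PATH=$HADOOP_HOME/bin:$PATH
# The Hadoop bin directory should now be on PATH:
echo "$PATH" | grep -q "$HADOOP_HOME/bin" && echo "PATH updated"
# prints "PATH updated"
```

On a machine where Hadoop is actually installed at that path, command -v hadoop should then resolve to $HADOOP_HOME/bin/hadoop.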

After the preceding steps are completed, Hadoop is installed. In local (standalone) mode, Hadoop is now ready for use. The other modes (pseudo-distributed and fully distributed), however, require further configuration; the specific configuration steps are easy to find online.
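As an illustration of the kind of configuration the other modes need, a minimal core-site.xml for pseudo-distributed mode might look like the following. This is a sketch, not from the original text; the property name fs.default.name and the hdfs://localhost:9000 address are conventional for Hadoop of this era (0.21), but check the documentation for your version:

```xml
<?xml version="1.0"?>
<!-- etc/hadoop/core-site.xml: point Hadoop at a single local HDFS instance -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```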
