Ubuntu 10.04: installing and configuring Hadoop-0.20.203.0 (getting started)

1. Install Sun's JDK 1.6 and point the JAVA_HOME environment variable at the JDK installation directory. (For details, see "Manually installing Sun's JDK 1.6 on Ubuntu 10.04".)
2. Download the stable Hadoop release and unpack it into the /opt/ directory.
3. On the command line, run
$ sudo gedit /etc/profile
and add the following at the end of the file to set the environment variables for the Hadoop installation location:
# Set Hadoop environment
export HADOOP_INSTALL=/opt/hadoop-0.20.203.0
export PATH=$PATH:$HADOOP_INSTALL/bin
4. Run the hadoop version command to check that the setup succeeded.
5. Configuration:
1) Standalone mode:
No special configuration is needed in this mode; you only need to set the JAVA_HOME environment variable of the Java JDK in hadoop-env.sh under the conf/ directory.
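That edit amounts to setting one line in conf/hadoop-env.sh (a sketch; the JDK path below is an assumption and should match wherever step 1 installed the JDK):

```shell
# conf/hadoop-env.sh -- point Hadoop at the JDK from step 1.
# The path below is an example; substitute your actual JDK directory.
export JAVA_HOME=/usr/lib/jvm/java-6-sun
```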
2) Pseudo-distributed mode:
In this mode you need to configure three files under the conf/ directory: core-site.xml, hdfs-site.xml, and mapred-site.xml.
conf/core-site.xml:

<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost/</value>
  </property>
</configuration>

conf/hdfs-site.xml:

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

conf/mapred-site.xml:

<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:8021</value>
  </property>
</configuration>
6. Configure SSH
% sudo apt-get install ssh
Create a new SSH key with an empty passphrase to enable password-less login:
% ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
% cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
Run the following command to test:
% ssh localhost
If it succeeds, you will not be prompted for a password.
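The key-generation step can be tried safely first (a sketch that writes to a temporary directory so it cannot clobber an existing ~/.ssh key):

```shell
# Generate a passphrase-less RSA key pair in a temporary directory,
# then append the public key to an authorized_keys file, mirroring step 6.
KEYDIR=$(mktemp -d)
ssh-keygen -q -t rsa -P '' -f "$KEYDIR/id_rsa"
cat "$KEYDIR/id_rsa.pub" >> "$KEYDIR/authorized_keys"
ls "$KEYDIR"
```

After verifying the files appear, repeat the same commands against ~/.ssh as shown above.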
7. Start and stop the daemons
% start-dfs.sh
% start-mapred.sh
start-dfs.sh starts three daemons on the local machine: a namenode, a secondary namenode, and a datanode; start-mapred.sh adds a jobtracker and a tasktracker.
View the jobtracker at http://localhost:50030/ and the namenode at http://localhost:50070/; you can also run the Java jps command to check whether the daemons are running.
% stop-dfs.sh
% stop-mapred.sh
8. Format the HDFS filesystem (do this once, before starting the daemons for the first time):
% hadoop namenode -format
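Once the filesystem is formatted and the daemons are up, a short smoke test confirms HDFS is accepting files (a sketch; the /user/test directory name is an arbitrary example, and these commands need the running cluster from step 7):

```shell
# Copy a local file into HDFS and list it back out.
hadoop fs -mkdir /user/test
hadoop fs -put /etc/hostname /user/test/hostname
hadoop fs -ls /user/test
```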

