Hadoop Pseudo-Distributed Installation Steps

Source: Internet
Author: User

Steps for installing Hadoop in pseudo-distributed mode:

1.1 Set a static IP address
    On the CentOS desktop, right-click the network icon in the upper-right corner and modify the settings, then restart the NIC with the command service network restart
    Verify: ifconfig

1.2 Modify the hostname
    <1> For the current session, run the command hostname hadoop
    <2> In the configuration file, run the command vi /etc/sysconfig/network and set HOSTNAME=hadoop
    Verify: restart the machine

1.3 Bind the hostname to the IP address
    Run the command vi /etc/hosts and add the line: 192.168.80.100 hadoop
    Verify: ping hadoop

1.4 Turn off the firewall
    Run the command service iptables stop
    Verify: service iptables status

1.5 Disable the firewall's automatic startup
    Run the command chkconfig iptables off
    Verify: chkconfig --list | grep iptables

1.6 Configure passwordless SSH (Secure Shell)
    <1> Run the command ssh-keygen -t rsa to generate a key pair under ~/.ssh
    <2> Run the command cp ~/.ssh/id_rsa.pub ~/.ssh/authorized_keys
    Verify: ssh localhost

1.7 Install the JDK
    <1> In /usr/local, run the command rm -rf * to delete the existing files
    <2> Use WinSCP to copy the JDK file into the /usr/local directory
    <3> Run chmod u+x jdk-... to grant execute permission
    <4> Run ./jdk-... to extract it
    <5> Run the command mv jdk-... jdk to rename the directory
    <6> Run the command vi /etc/profile to set the environment variables, adding two lines:
        export JAVA_HOME=/usr/local/jdk
        export PATH=.:$JAVA_HOME/bin:$PATH
        Save and exit, then run the command source /etc/profile to make the settings take effect.
    Verify: java -version

1.8 Install Hadoop
    1. Run the command tar -zxvf hadoop-1...
    2. Run the command mv hadoop-1... hadoop to rename the directory
    3. Run the command vi /etc/profile to set the environment variable (note: the original text repeated the JAVA_HOME lines here; what actually needs to be added is HADOOP_HOME):
        export HADOOP_HOME=/usr/local/hadoop
        export PATH=.:$HADOOP_HOME/bin:$JAVA_HOME/bin:$PATH
        Save and exit, then run the command source /etc/profile to make the settings take effect.
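The /etc/profile additions from steps 1.7 and 1.8 above can be sketched as a single fragment; the paths assume the JDK and Hadoop were unpacked to /usr/local as described:

```shell
# Append to /etc/profile, then run: source /etc/profile
# Paths assume the layout used in this guide (/usr/local/jdk, /usr/local/hadoop).
export JAVA_HOME=/usr/local/jdk
export HADOOP_HOME=/usr/local/hadoop
# Leading "." keeps the current directory on the PATH, as in the original steps.
export PATH=.:$HADOOP_HOME/bin:$JAVA_HOME/bin:$PATH
```

After sourcing the file, both java and the hadoop scripts resolve from any directory.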
    4. Modify the four Hadoop configuration files under $HADOOP_HOME/conf: hadoop-env.sh, core-site.xml, hdfs-site.xml, mapred-site.xml
    5. Format Hadoop: run the command hadoop namenode -format
    6. Start Hadoop: run the command start-all.sh
    Verify: (1) Run the command jps; it should show five Java processes: NameNode, DataNode, SecondaryNameNode, JobTracker, and TaskTracker. (2) Open http://hadoop:50070/dfshealth.jsp and http://hadoop:50030/jobtracker.jsp in a browser. On Windows, first add the hostname mapping to C:\WINDOWS\system32\drivers\etc\hosts.

1.9 If the NameNode process does not start, the likely causes are:
    - Hadoop was not formatted
    - the configuration files were only copied, not modified for this machine
    - the hostname is not bound to the IP address
    - passwordless SSH was not configured successfully

2.0 Formatting Hadoop more than once is also an error. Fix: delete the /usr/local/hadoop/tmp directory and format again.
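The four configuration edits in step 4 can be sketched as follows. The property values shown are the typical minimal pseudo-distributed settings for Hadoop 1.x, not taken verbatim from the original text; the hostname "hadoop" and ports 9000/9001 match the binding from step 1.3 and the classic defaults. For illustration the files are written into a local conf directory; on a real install the target is $HADOOP_HOME/conf.

```shell
# Illustrative sketch: write minimal pseudo-distributed configs for Hadoop 1.x.
# On a real install, set CONF=/usr/local/hadoop/conf instead.
CONF=conf
mkdir -p "$CONF"

# hadoop-env.sh: tell Hadoop where the JDK from step 1.7 lives
echo 'export JAVA_HOME=/usr/local/jdk' > "$CONF/hadoop-env.sh"

# core-site.xml: NameNode address, plus the tmp dir that step 2.0 deletes on re-format
cat > "$CONF/core-site.xml" <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://hadoop:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/usr/local/hadoop/tmp</value>
  </property>
</configuration>
EOF

# hdfs-site.xml: one replica is enough on a single node
cat > "$CONF/hdfs-site.xml" <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
EOF

# mapred-site.xml: JobTracker address (MapReduce v1)
cat > "$CONF/mapred-site.xml" <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>hadoop:9001</value>
  </property>
</configuration>
EOF
```

With these four files in place, step 5 (hadoop namenode -format) and step 6 (start-all.sh) can proceed.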
