Installing Hadoop on Linux (Pseudo-Distributed Mode)

Source: Internet
Author: User

Before you begin: when installing Hadoop on Linux, pay attention to permission issues and run Hadoop as a non-root user with the appropriate permissions. This article does not cover how to create a new user in Linux.

Step 1: Install the JDK

1. Download the JDK from java.sun.com.
2. Extract the tar.gz file:
   tar -xzvf jdk-7u17-linux-x64.tar.gz
3. Move the extracted folder to /usr/local/jdk-1.7:
   mv /usr/local/src/jdk1.7.0_17 /usr/local/jdk-1.7
4. Modify the profile (vi /etc/profile) and append:
   export JAVA_HOME=/usr/local/jdk-1.7
   export JRE_HOME=$JAVA_HOME/jre
   export CLASSPATH=.:$JAVA_HOME/jre/lib/rt.jar:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
   export PATH=$PATH:$JAVA_HOME/bin
5. Make the environment variables take effect:
   source /etc/profile
6. View the current JDK:
   update-alternatives --display java
7. Configure the default JDK (update-alternatives takes a link, a generic name, a path, and a priority):
   update-alternatives --install /usr/bin/java java /usr/local/jdk-1.7/bin/java 300
   update-alternatives --install /usr/bin/javac javac /usr/local/jdk-1.7/bin/javac 300
   update-alternatives --install /usr/bin/javap javap /usr/local/jdk-1.7/bin/javap 300
   update-alternatives --install /usr/bin/javadoc javadoc /usr/local/jdk-1.7/bin/javadoc 300
   update-alternatives --config java
   If other JDKs are installed on the system, a prompt is displayed; select jdk7.
8. Test the installation:
   java -version

Step 2: Install the SSHD service

On Ubuntu, install sshd with:
   sudo apt-get install openssh-server
To remove sshd on Ubuntu:
   sudo apt-get remove openssh-server

Step 3: Configure password-less login (execute as the non-root user)

   ssh-keygen
   cd ~/.ssh
Append the generated public key to the authorized keys file so that ssh to the local machine works without a password.

Step 4: Download and unpack Hadoop

Unpack:
   tar -zxvf hadoop-1.0.3.tar.gz
Move it into place:
   mv hadoop-1.0.3 /usr/local/hadoop
Change the folder's owner:
   chown -R hadoop.hadoop /usr/local/hadoop
Create the data directory and change its owner:
   mkdir /data/hadoop
   chown -R hadoop.hadoop /data/hadoop

Step 5: Configure Hadoop

Edit conf/hadoop-env.sh:
   export JAVA_HOME=/usr/local/jdk-1.7
   export HADOOP_LOG_DIR=/data/hadoop/logs
   export HADOOP_PID_DIR=/data/hadoop/pids

Modify /etc/profile and add:
   export HADOOP_PID_DIR=/data/hadoop/pids

Configure conf/core-site.xml:
   fs.default.name      hdfs://master:8020
   hadoop.tmp.dir       /data/hadoop/tmp

Configure conf/hdfs-site.xml:
   dfs.name.dir         /data/hadoop/name
   dfs.data.dir         /data/hadoop/data
   dfs.replication      2
   dfs.permissions      false
   hadoop.job.ugi       hadoop,supergroup

Configure conf/mapred-site.xml:
   mapred.job.tracker                       master:8021
   mapred.tasktracker.map.tasks.maximum     2
   mapred.map.tasks                         2
   mapred.tasktracker.reduce.tasks.maximum  2
   mapred.reduce.tasks                      1
   mapred.compress.map.output               true
   mapred.map.output.compression.codec      com.hadoop.compression.lzo.LzoCodec

Step 6: Start and stop Hadoop

Format the NameNode (only required before the first start):
   bin/hadoop namenode -format
Start all daemons:
   bin/start-all.sh
Use a browser to visit http://localhost:50070 and http://localhost:50030 to check whether Hadoop started successfully.
Stop all daemons:
   bin/stop-all.sh
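The password-less login setup in Step 3 is terse. As an illustrative sketch — assuming OpenSSH and the default key path ~/.ssh/id_rsa, and run as the non-root hadoop user — the full sequence looks like this:

```shell
# Sketch of Step 3: password-less SSH login (assumes OpenSSH defaults).
KEYDIR="$HOME/.ssh"
mkdir -p "$KEYDIR" && chmod 700 "$KEYDIR"
# Generate an RSA key pair with an empty passphrase (-P ''),
# skipping generation if a key already exists so nothing is overwritten.
[ -f "$KEYDIR/id_rsa" ] || ssh-keygen -q -t rsa -P '' -f "$KEYDIR/id_rsa"
# Authorize the public key for logins to this machine.
cat "$KEYDIR/id_rsa.pub" >> "$KEYDIR/authorized_keys"
# sshd ignores key files with overly permissive modes.
chmod 600 "$KEYDIR/authorized_keys"
```

After this, "ssh localhost" should log in without a password prompt; on the very first connection you may still be asked to confirm the host key.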
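The property lists in Step 5 go into Hadoop's XML configuration files. As a minimal sketch, core-site.xml built from the values above looks like this — the file is written to /tmp purely for illustration, so a real conf/ directory is not touched (in practice you would write conf/core-site.xml, and hdfs-site.xml and mapred-site.xml follow the same name/value layout):

```shell
# Sketch: the Step 5 core-site.xml values in Hadoop's XML property format.
# Written to /tmp for illustration only; the real file is conf/core-site.xml.
cat > /tmp/core-site.xml <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://master:8020</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/data/hadoop/tmp</value>
  </property>
</configuration>
EOF
cat /tmp/core-site.xml
```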
 
