Hadoop pseudo-distributed setup on CentOS

Required software and versions:
  jdk-7u80-linux-x64.tar.gz
  hadoop-2.6.0.tar.gz

1. Install the JDK
Hadoop needs a JDK to run. Note that Oracle's JDK is the safest choice; other JDKs can trigger bugs.

2. Create a user
[root@mydb01 ~]# groupadd hadoop
[root@mydb01 ~]# useradd -g hadoop hadoop
[root@mydb01 ~]# passwd hadoop

3. Configure SSH
Confirm that both the SSH client and server are installed:
[hadoop@mydb01 ~]$ rpm -qa | grep ssh
libssh2-1.4.2-1.el6.x86_64
openssh-clients-5.3p1-94.el6.x86_64
openssh-server-5.3p1-94.el6.x86_64
openssh-5.3p1-94.el6.x86_64

Configure passwordless SSH login:
[hadoop@mydb01 ~]$ mkdir ~/.ssh
[hadoop@mydb01 ~]$ cd ~/.ssh/                      # if this directory does not exist, run "ssh localhost" first
[hadoop@mydb01 .ssh]$ ssh-keygen -t dsa            # just press Enter at every prompt
[hadoop@mydb01 .ssh]$ cat id_dsa.pub >> authorized_keys   # authorize the key
[hadoop@mydb01 .ssh]$ chmod 600 ./authorized_keys  # fix the file permissions; without this the login fails, since CentOS validates them strictly
[hadoop@mydb01 .ssh]$ ssh mydb01                   # test

4. Install Hadoop
Download:
[hadoop@mydb01 ~]$ wget http://archive.apache.org/dist/hadoop/core/hadoop-2.6.0/hadoop-2.6.0.tar.gz
Extract:
[root@mydb01 hadoop]# tar -zxvf hadoop-2.6.0.tar.gz -C /usr/local
Rename the directory and fix ownership:
[root@mydb01 local]# mv hadoop-2.6.0/ hadoop/          # rename the folder
[root@mydb01 local]# chown -R hadoop:hadoop ./hadoop   # change ownership
Edit the environment variables:
[hadoop@mydb01 ~]$ vi .bash_profile
Append the following to ~/.bash_profile:

export HADOOP_HOME=/usr/local/hadoop
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.111.x86_64
export PATH=$JAVA_HOME/bin:$PATH
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar

5. Configure pseudo-distributed mode
Note: manually create the root directory /hadoop first, owned by the hadoop user.

1) core-site.xml:
<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>file:/hadoop/tmp</value>
  </property>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://mydb01:9000</value>
  </property>
</configuration>

2) hdfs-site.xml:
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:/hadoop/tmp/dfs/name</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:/hadoop/tmp/dfs/data</value>
  </property>
</configuration>

6. Format the HDFS filesystem
[hadoop@mydb01 ~]$ hadoop namenode -format
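After the format succeeds, the single-node HDFS can be started and checked. A sketch of the usual next steps on Hadoop 2.6, assuming the install and profile above are in place (start-dfs.sh lives in $HADOOP_HOME/sbin):

```shell
# Start the HDFS daemons and verify the pseudo-distributed cluster.
[hadoop@mydb01 ~]$ start-dfs.sh
[hadoop@mydb01 ~]$ jps                     # should list NameNode, DataNode, SecondaryNameNode
[hadoop@mydb01 ~]$ hdfs dfsadmin -report   # the DataNode should appear in the report
```

The NameNode web UI is also reachable at http://mydb01:50070 (the Hadoop 2.x default port).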