JDK Installation
```
tar zxvf jdk-<version>.tar.gz
mv jdk-<version> /usr/lib/java
```
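A quick sanity check at this point is to run the bundled java binary directly, before any environment variables are set; the path below assumes the JDK was moved to /usr/lib/java as in the step above.

```
# Run java straight from the install directory to confirm the JDK
# was unpacked and moved correctly (path assumed from the step above)
/usr/lib/java/bin/java -version
```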
JDK environment variable configuration
```
vim /etc/profile
```

Append the following lines to the end of the file:

```
export JAVA_HOME=/usr/lib/java
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export PATH=${PATH}:${JAVA_HOME}/bin:${JRE_HOME}/bin
export HADOOP_HOME=/usr/lib/hadoop
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
```
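A brief sketch of how the new variables can be applied and checked in the current shell, assuming the edits above were saved to /etc/profile:

```
# Reload the profile so the exports take effect in this shell
source /etc/profile

# Verify the variables and the java binary on the PATH
echo $JAVA_HOME
echo $HADOOP_HOME
java -version
```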
Installing Hadoop 2.6.4
```
tar zxvf hadoop-2.6.4.tar.gz
mv hadoop-2.6.4 /usr/lib/hadoop
```
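To confirm the move, a quick listing of the directory (assuming the layout above) should show the usual Hadoop subdirectories:

```
# The top-level Hadoop directory should contain bin/, sbin/, etc/hadoop/, share/ ...
ls /usr/lib/hadoop
```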
Hadoop environment variable configuration
These are the same two lines appended to /etc/profile in the JDK step above:

```
export HADOOP_HOME=/usr/lib/hadoop
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
```
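With HADOOP_HOME on the PATH, the hadoop command should resolve from any directory; a quick check:

```
# Prints the Hadoop build and version info if HADOOP_HOME and PATH are set correctly
hadoop version
```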
Hadoop stand-alone configuration
```
cd /usr/lib/hadoop/etc/hadoop
ls
```

vim core-site.xml

```
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://192.168.128.129:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/usr/lib/hadoop/tmp</value>
  </property>
</configuration>
```

vim hdfs-site.xml

```
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.permissions</name>
    <value>false</value>
  </property>
  <property>
    <name>dfs.datanode.max.transfer.threads</name>
    <value>8192</value>
    <description>Specifies the maximum number of threads to use for transferring data in and out of the DN.</description>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>/usr/lib/hadoop/hdfs/data</value>
  </property>
</configuration>
```

vim mapred-site.xml

```
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>
```

vim yarn-site.xml

```
<?xml version="1.0"?>
<configuration>
  <!-- Site specific YARN configuration properties -->
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
</configuration>
```

vim hadoop-env.sh

```
# The java implementation to use.
export JAVA_HOME=/usr/lib/java/jre
export HADOOP_PREFIX=/usr/lib/hadoop
```

vim yarn-env.sh

```
export JAVA_HOME=/usr/lib/java/jre
```
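One step the notes above do not mention: on a brand-new installation the HDFS NameNode normally has to be formatted once before the daemons are started for the first time. A minimal sketch using the standard Hadoop CLI (an assumed prerequisite, not part of the original notes):

```
# Format the NameNode once on a fresh install only;
# re-running this wipes existing HDFS metadata.
hdfs namenode -format
```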
Start Hadoop once the above configuration is complete
```
cd /usr/lib/hadoop/sbin
# start HDFS
./start-dfs.sh
# start YARN
./start-yarn.sh
```
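A quick way to check that the daemons actually came up, assuming the JDK's jps tool is on the PATH:

```
# jps lists the running JVM processes; after start-dfs.sh and start-yarn.sh
# you would typically expect NameNode, DataNode, SecondaryNameNode,
# ResourceManager and NodeManager to be listed.
jps
```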
Passwordless SSH login for Hadoop
"' Method one:? Under command Terminal, enter the following command:? (Note: Not relevant to the current directory)Ssh-keygen-t RSA? Follow the instructions in the brackets below (Enterfile inch whichTo save the key (/home/youruser/.SSH/Id_rsa): (Note: Press ENTER here to accept the default file name) Enter passphrase (empty forno passphrase): (Note: Press ENTER here does not set RSA private key encryption) Enter same passphrase again: (Note: Press ENTER here does not set RSA private key encryption) Your Identifica tion has been savedinch/home/youruser/.SSH/Id_rsa. Your public key has been savedinch/home/youruser/.SSH/id_rsa.pub.)? CD~/.SSH/? Cat IDRsa.pub >>Authorized_keyschmod -Authorized_keys (Note: Online introduction of the method generally does not have this line, but on my machine if not add this line is not successful)?? ```
hadoop2.6.4 "Ubuntu" stand-alone Environment Building series 1