Installation Environment:
- CentOS 7 x86_64
- JDK 8
- Hadoop 2.9.0
Installation steps:
1. Install and configure the JDK.
2. Download Hadoop 2.9.0.
3. Extract Hadoop:
tar zxvf hadoop-2.9.0.tar.gz
4. Add Hadoop's bin and sbin directories to the PATH.
# Edit the profile
vim /etc/profile
# Set the environment variables
HADOOP_HOME=/usr/local/hadoop-2.9.0
PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
export HADOOP_HOME PATH
# Make the configuration in the file take effect
source /etc/profile
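To confirm the PATH edit took effect, a quick check can be run in a new shell. This is a minimal sketch using the install path from the steps above; after a real install, `hadoop version` should also print the release.

```shell
# Reproduce the PATH setup from above and verify the bin directory is on the PATH
HADOOP_HOME=/usr/local/hadoop-2.9.0
PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
echo "$PATH" | grep -q "$HADOOP_HOME/bin" && echo "PATH ok"
```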
Configuration steps:
1. Set JAVA_HOME in the hadoop-env.sh file, using an absolute path.
2. Configure the individual files:
core-site.xml: used to configure common properties.
hdfs-site.xml: used to configure HDFS properties.
mapred-site.xml: used to configure MapReduce properties.
yarn-site.xml: used to configure YARN properties.
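Step 1 above edits hadoop-env.sh under the Hadoop configuration directory. A minimal sketch, assuming an OpenJDK 8 install path (adjust to your actual JDK location):

```shell
# In etc/hadoop/hadoop-env.sh -- use an absolute path,
# not the shell's $JAVA_HOME (the JDK path shown is an assumption)
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk
```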
Key Configuration items:
Component | Property name | Standalone mode | Pseudo-distributed mode | Fully distributed mode
Common | fs.default.name | file:/// | hdfs://localhost/ | hdfs://namenode/
HDFS | dfs.replication | N/A | 1 | 3
MapReduce 1 | mapred.job.tracker | local | localhost:8021 | jobtracker:8021
YARN (MapReduce 2) | yarn.resourcemanager.address | N/A | localhost:8032 | resourcemanager:8032
<!-- core-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost/</value>
  </property>
</configuration>

<!-- hdfs-site.xml -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

<!-- mapred-site.xml -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:8021</value>
  </property>
</configuration>

<!-- yarn-site.xml -->
<configuration>
  <property>
    <name>yarn.resourcemanager.address</name>
    <value>localhost:8032</value>
  </property>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
</configuration>
Formatting:
# Format the filesystem
hadoop namenode -format
Startup steps:
1. MapReduce 1
# Start HDFS
start-dfs.sh --config path-to-config-directory
# Start MapReduce
start-mapred.sh --config path-to-config-directory

# Stop HDFS
stop-dfs.sh
# Stop MapReduce
stop-mapred.sh
2. MapReduce 2
# Start HDFS
start-dfs.sh --config path-to-config-directory
# Start YARN
start-yarn.sh --config path-to-config-directory

# Stop HDFS
stop-dfs.sh
# Stop YARN
stop-yarn.sh
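After the start scripts finish, the JDK's jps command should list the daemon processes. The snippet below simulates a typical pseudo-distributed jps listing (the PIDs and output are illustrative, not from a real run) and checks that the expected daemons appear:

```shell
# Simulated jps output for a single-node (pseudo-distributed) cluster -- on a
# real machine, replace this variable with: jps_output=$(jps)
jps_output="1234 NameNode
2345 DataNode
3456 SecondaryNameNode
4567 ResourceManager
5678 NodeManager"

# Each expected daemon should appear in the listing
for daemon in NameNode DataNode SecondaryNameNode ResourceManager NodeManager; do
  echo "$jps_output" | grep -q "$daemon" && echo "$daemon running"
done
```

If any daemon is missing, its log file under $HADOOP_HOME/logs is the first place to look.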
Linux-hadoop Installation