One: stand-alone mode
1. sudo gedit ~/.bashrc and add the JDK path:
#HADOOP VARIABLES START
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-amd64
export HADOOP_INSTALL=/home/sendi/hadoop-2.6.0
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
#HADOOP VARIABLES END
On Ubuntu, the JDK path can be obtained with the following command:
update-alternatives --config java
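When java is already on the PATH, the JDK home can also be derived non-interactively. This is a sketch, assuming the Debian/Ubuntu layout where /usr/bin/java is a symlink into the JDK tree:

```shell
# Resolve the real java binary, then strip the trailing bin/java
# (and jre/bin/java for JDK layouts with a bundled JRE) to get the JDK home.
if command -v java >/dev/null 2>&1; then
  java_home=$(readlink -f "$(command -v java)" | sed 's:/jre/bin/java$::; s:/bin/java$::')
  echo "$java_home"
fi
```

The result is what JAVA_HOME should be set to in ~/.bashrc.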
2. Execute the following command to make the change effective:
source ~/.bashrc
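After sourcing, the variables should be visible in the current shell. A quick check (paths assume the /home/sendi/hadoop-2.6.0 layout used above):

```shell
# Verify the exported variables from ~/.bashrc took effect in this shell
echo "JAVA_HOME=$JAVA_HOME"
echo "HADOOP_INSTALL=$HADOOP_INSTALL"
```

If both print empty values, the `source` step was skipped or the block was not saved to ~/.bashrc.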
3. Modify the configuration of the hadoop-env.sh:
sudo gedit /usr/local/hadoop/etc/hadoop/hadoop-env.sh
Find JAVA_HOME and change it to:
/usr/lib/jvm/java-1.7.0-openjdk-amd64
Then add the following line below it:
export HADOOP_OPTS="-Djava.library.path=$HADOOP_PREFIX/lib:$HADOOP_PREFIX/lib/native"
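To confirm the native library path takes effect, Hadoop ships a checknative command. A sketch, assuming it is run from the Hadoop installation directory:

```shell
# List which native libraries Hadoop can load (hadoop, zlib, snappy, ...);
# only runs when the hadoop launcher is present at bin/hadoop
if [ -x bin/hadoop ]; then
  bin/hadoop checknative -a
fi
```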
Two: pseudo-distributed mode
1. sudo gedit /usr/local/hadoop/etc/hadoop/core-site.xml
<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/sendi/hadoop-2.6.0/tmp</value>
    <description>A base for other temporary directories.</description>
  </property>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
2. sudo gedit /usr/local/hadoop/etc/hadoop/mapred-site.xml
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
3. sudo gedit /usr/local/hadoop/etc/hadoop/yarn-site.xml
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
</configuration>
4. sudo gedit /usr/local/hadoop/etc/hadoop/hdfs-site.xml
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:/home/sendi/hadoop-2.6.0/dfs/name</value>
  </property>
</configuration>
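The tmp and name directories referenced in core-site.xml and hdfs-site.xml above should exist before formatting HDFS. A sketch that creates them up front, assuming the /home/sendi/hadoop-2.6.0 layout used throughout:

```shell
# Base directory from the configuration above; override HADOOP_BASE if yours differs
: "${HADOOP_BASE:=/home/sendi/hadoop-2.6.0}"
# Create the directories behind hadoop.tmp.dir and dfs.namenode.name.dir
mkdir -p "$HADOOP_BASE/tmp" "$HADOOP_BASE/dfs/name" ||
  echo "could not create directories under $HADOOP_BASE (check permissions)"
```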
5. sudo gedit /usr/local/hadoop/etc/hadoop/masters
Add: localhost
6. sudo gedit /usr/local/hadoop/etc/hadoop/slaves
Add: localhost
7. Initialize the HDFS file system:
bin/hdfs namenode -format
8. Start the daemons:
sbin/start-dfs.sh
sbin/start-yarn.sh
--------------------------------------------------------------------------------
sendi@sendijia:~/hadoop-2.6.0$ sbin/start-dfs.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /home/sendi/hadoop-2.6.0/logs/hadoop-sendi-namenode-sendijia.out
localhost: starting datanode, logging to /home/sendi/hadoop-2.6.0/logs/hadoop-sendi-datanode-sendijia.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/sendi/hadoop-2.6.0/logs/hadoop-sendi-secondarynamenode-sendijia.out
sendi@sendijia:~/hadoop-2.6.0$ sbin/start-yarn.sh
starting yarn daemons
starting resourcemanager, logging to /home/sendi/hadoop-2.6.0/logs/yarn-sendi-resourcemanager-sendijia.out
localhost: starting nodemanager, logging to /home/sendi/hadoop-2.6.0/logs/yarn-sendi-nodemanager-sendijia.out
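After both start scripts finish as in the transcript above, the running daemons can be verified with jps, which ships with the JDK and lists Java processes:

```shell
# After a successful start, jps should list NameNode, DataNode,
# SecondaryNameNode, ResourceManager, NodeManager (plus Jps itself)
if command -v jps >/dev/null 2>&1; then
  jps
fi
```

If any daemon is missing from the list, check the corresponding .out/.log file under the logs directory shown in the transcript.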
Installing Hadoop 2.6 on Ubuntu