1. Starting and stopping Hadoop
1) First way
Start HDFS and MapReduce separately. The commands are as follows:
Start:
$ start-dfs.sh
$ start-mapred.sh
Stop:
$ stop-mapred.sh
$ stop-dfs.sh
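Once both scripts have run, you can verify each layer separately. A minimal check, assuming a Hadoop 1.x installation whose bin directory is on the PATH:
$ hadoop dfsadmin -report    # HDFS side: prints capacity and live DataNodes (requires the NameNode to be up)
$ hadoop job -list           # MapReduce side: lists running jobs (requires the JobTracker to be up)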
2) Second way
Start or stop everything at once.
Start: start-all.sh
Start order: NameNode, DataNode, SecondaryNameNode, JobTracker, TaskTracker
Stop: stop-all.sh
Stop order: JobTracker, TaskTracker, NameNode, DataNode, SecondaryNameNode
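To confirm that all five daemons listed above are actually running, you can list the Java processes with jps (shipped with the JDK). A minimal check, assuming a single-node setup where every daemon runs on the same machine:
$ start-all.sh
$ jps    # should show NameNode, DataNode, SecondaryNameNode, JobTracker, TaskTracker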
3) Third way
Start each daemon individually. The start order is as follows:
NameNode, DataNode, SecondaryNameNode, JobTracker, TaskTracker
The commands are as follows:
Start:
hadoop-daemon.sh start namenode
hadoop-daemon.sh start datanode
hadoop-daemon.sh start secondarynamenode
hadoop-daemon.sh start jobtracker
hadoop-daemon.sh start tasktracker
Stop:
hadoop-daemon.sh stop jobtracker
hadoop-daemon.sh stop tasktracker
hadoop-daemon.sh stop namenode
hadoop-daemon.sh stop datanode
hadoop-daemon.sh stop secondarynamenode
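On a multi-node cluster, the third way is also how you bring a single node back, for example after replacing a failed worker. A short sketch, assuming a Hadoop 1.x layout where hadoop-daemon.sh acts only on the local machine while hadoop-daemons.sh runs the same command on every host listed in the conf/slaves file:
hadoop-daemon.sh start datanode       # run on one worker: starts the DataNode on that machine only
hadoop-daemon.sh start tasktracker    # run on the same worker: starts its TaskTracker
hadoop-daemons.sh start datanode      # run on the master: starts DataNodes on all hosts in conf/slaves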
2. Analysis of the start shell scripts
1) Looking at the start-all.sh script:
First: this shell script is executed only on the master node.
Second: it starts the DFS file system daemons first, then the MapReduce framework daemons.
Third: to start the HDFS daemons it calls the start-dfs.sh script; to start the MapReduce daemons it calls the start-mapred.sh script.
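Based on the three points above, the core of start-all.sh can be sketched roughly as follows (a simplified outline of the Hadoop 1.x script, not a verbatim copy):
# start-all.sh (simplified sketch): bring up HDFS first, then MapReduce
bin=`dirname "$0"`; bin=`cd "$bin"; pwd`             # directory where the Hadoop scripts live
. "$bin"/hadoop-config.sh                            # load the cluster configuration
"$bin"/start-dfs.sh --config $HADOOP_CONF_DIR        # 1. HDFS daemons
"$bin"/start-mapred.sh --config $HADOOP_CONF_DIR     # 2. MapReduce daemons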
2) Looking at the start-dfs.sh script:
First: this script runs on the master node of the DFS file system.
Second: if a DataNode daemon is started before the NameNode daemon is running, the DataNode log will keep reporting errors about failing to connect to the NameNode.
Third: it fixes the start order of the HDFS daemons: NameNode, then DataNode, then SecondaryNameNode.
Fourth: the NameNode is started by calling the hadoop-daemon.sh script; the DataNode and SecondaryNameNode are started by calling the hadoop-daemons.sh script.
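Correspondingly, the daemon-launching part of start-dfs.sh looks roughly like this (simplified from the Hadoop 1.x script): the NameNode is started locally with hadoop-daemon.sh, while the DataNodes and the SecondaryNameNode are started on the hosts listed in the slaves and masters files with hadoop-daemons.sh:
# start-dfs.sh (simplified sketch of the daemon-launching lines)
"$bin"/hadoop-daemon.sh --config $HADOOP_CONF_DIR start namenode                            # local master node
"$bin"/hadoop-daemons.sh --config $HADOOP_CONF_DIR start datanode                           # all hosts in conf/slaves
"$bin"/hadoop-daemons.sh --config $HADOOP_CONF_DIR --hosts masters start secondarynamenode  # hosts in conf/masters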