First, download the Hadoop package; we use version 1.2.1, from http://mirrors.cnnic.cn/apache/hadoop/common/hadoop-1.2.1/
There you can download either hadoop-1.2.1.tar.gz, which includes the source code, or hadoop-1.2.1-bin.tar.gz, the binary-only package without source; the one with source is roughly twice the size of the other.
After downloading, unpack it with tar -xzvf hadoop-1.2.1.tar.gz, move the extracted directory to wherever you want Hadoop to live, then go into the conf directory to start editing the configuration files.
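A quick sketch of those steps (the /usr/local location is only an example; put it wherever you prefer):

# download and unpack Hadoop 1.2.1
wget http://mirrors.cnnic.cn/apache/hadoop/common/hadoop-1.2.1/hadoop-1.2.1.tar.gz
tar -xzvf hadoop-1.2.1.tar.gz
# move it to the directory of your choice, e.g. /usr/local
mv hadoop-1.2.1 /usr/local/hadoop-1.2.1
# the configuration files live in conf
cd /usr/local/hadoop-1.2.1/conf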
The main work is configuring four files: hadoop-env.sh, core-site.xml, hdfs-site.xml, and mapred-site.xml. Open each of them with vim.
First, hadoop-env.sh.
Uncomment the JAVA_HOME line, set it to your JDK directory, then save and exit.
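The uncommented line ends up looking something like this (the JDK path below is only a placeholder; use the directory your JDK is actually installed in):

# in conf/hadoop-env.sh, uncomment and adjust:
export JAVA_HOME=/usr/lib/jvm/jdk1.6.0_45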
Next, core-site.xml: add the properties directly between the <configuration> tags.
The first property is Hadoop's working directory, here /hadoop; the second is the directory for the DFS name node; and the third points the default file system at port 9000 on the local machine.
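For reference, the file might end up looking something like the sketch below, written as a shell heredoc so it can be pasted in one go. The property names are the standard Hadoop 1.x ones matching the description above; the /hadoop and /hadoop/name paths are only examples.

cat > core-site.xml <<'EOF'
<?xml version="1.0"?>
<configuration>
  <!-- working directory of Hadoop -->
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/hadoop</value>
  </property>
  <!-- directory for the DFS name node (example path) -->
  <property>
    <name>dfs.name.dir</name>
    <value>/hadoop/name</value>
  </property>
  <!-- default file system: the local machine, port 9000 -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF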
hdfs-site.xml:

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
mapred-site.xml:

<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
After saving, you can add Hadoop's bin directory to your PATH environment variable, or skip that step; it just saves typing the full path to the hadoop commands.
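If you do add it, a line like this in ~/.bashrc is enough (the path is the example install location used earlier):

# optional: put the Hadoop commands on the PATH
export PATH=$PATH:/usr/local/hadoop-1.2.1/bin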
Then set up passwordless SSH login, as described earlier.
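For reference, the usual local setup looks roughly like this (assuming an RSA key and logging in to this same machine):

# generate a key pair (accept the defaults, empty passphrase)
ssh-keygen -t rsa
# authorize the key for logins to this machine
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
# verify that no password is asked for
ssh localhost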
Finally, start Hadoop by running bin/start-all.sh from the Hadoop directory.
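Roughly, from the Hadoop directory (note that on a brand-new installation HDFS usually has to be formatted once before the first start):

cd /usr/local/hadoop-1.2.1        # example install path from earlier
bin/hadoop namenode -format       # only needed once, on a fresh install
bin/start-all.sh                  # starts the HDFS and MapReduce daemons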
At this point, if http://localhost:50030 (the JobTracker web UI) and http://localhost:50070 (the NameNode web UI) open normally in a browser, the configuration is working.
Then run the jps command to see the Hadoop processes that are running; if they are all there, the stand-alone configuration of Hadoop is complete.
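On a working single-node setup, jps typically lists something like the following (the process IDs will of course differ):

$ jps
2837 NameNode
2963 DataNode
3089 SecondaryNameNode
3164 JobTracker
3290 TaskTracker
3412 Jps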