CentOS 6.5 Pseudo-Distributed Hadoop Installation
Required software: jdk-6u24-linux-i586.bin, hadoop-1.2.1.tar.gz, hadoop-eclipse-plugin-1.2.1.jar,
eclipse-jee-indigo-SR2-linux-gtk.tar.gz.
Assume that all software packages are stored in the /home/hadoop folder.
1. JDK Installation
1.1 Log in as the root user and create a new directory with mkdir /usr/local/program; it will be used to store the JDK. This tutorial uses JDK version jdk-6u24-linux-i586.bin.
1.2 Copy the JDK package into place (for example, if the JDK is already in the hadoop home directory, copy it with cp /home/hadoop/jdk-6u24-linux-i586.bin /usr/local/program/).
1.3 Extract the JDK
Go to the /usr/local/program/ directory and run ./jdk-6u24-linux-i586.bin.
When extraction succeeds, a registration prompt may appear; you can skip that step.
1.4 You can then delete the JDK installer package:
rm -rf jdk-6u24-linux-i586.bin
1.5 Configure the JDK
Log in as the root user and run vim /etc/profile (the file /etc/profile is very important and will be used again later in the Hadoop configuration). Press i to enter insert mode, then append the following content at the end of the file:
# set Java environment
export JAVA_HOME=/usr/local/program/jdk1.6.0_24
export JRE_HOME=/usr/local/program/jdk1.6.0_24/jre
export CLASSPATH=.:$JAVA_HOME/lib:$JAVA_HOME/jre/lib
export PATH=$JAVA_HOME/bin:$JAVA_HOME/jre/bin:$PATH
Press Esc, then type :wq to save and exit.
After exiting, run source /etc/profile to make the configuration take effect.
Once that is done, run java -version on the command line to check whether the configuration succeeded.
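As a sketch, the four export lines above can be rehearsed in a throwaway file before editing /etc/profile itself (the jdk1.6.0_24 path matches this tutorial; adjust it if your JDK lives elsewhere):

```shell
# Write the Java environment settings to a scratch file and source it in the
# current shell. The quoted 'EOF' keeps $JAVA_HOME literal in the file; it is
# expanded when the file is sourced.
cat > /tmp/java-env.sh <<'EOF'
export JAVA_HOME=/usr/local/program/jdk1.6.0_24
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=.:$JAVA_HOME/lib:$JAVA_HOME/jre/lib
export PATH=$JAVA_HOME/bin:$JAVA_HOME/jre/bin:$PATH
EOF
. /tmp/java-env.sh
echo "JAVA_HOME=$JAVA_HOME"
```

If the variables echo back correctly here, the same lines are safe to append to /etc/profile.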
2. SSH Password-Free Authentication
2.1 With root privileges, run rpm -qa | grep openssh to check whether SSH is installed, and rpm -qa | grep rsync for rsync.
2.2 Generate a password-free key pair
Use the command ssh-keygen -t rsa -P ''
If a key pair already exists, ssh-keygen will ask whether to overwrite it; for this walkthrough, overwriting does not matter.
2.3 Append id_rsa.pub to the authorized keys
Command: cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
2.4 Verify that the configuration is successful
Run ssh localhost to check whether you can log in without a password.
No password should be required; the first time, you only need to type yes to accept the host key.
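Steps 2.2 and 2.3 can be rehearsed against a temporary directory first, as a sketch (assumes the OpenSSH client tools are installed; the real run targets ~/.ssh instead of a temp dir):

```shell
# Generate a password-less RSA key pair into a temporary directory and append
# the public key to an authorized_keys file, mirroring steps 2.2-2.3.
tmpdir=$(mktemp -d)
ssh-keygen -t rsa -P '' -f "$tmpdir/id_rsa" -q
cat "$tmpdir/id_rsa.pub" >> "$tmpdir/authorized_keys"
chmod 600 "$tmpdir/authorized_keys"   # sshd refuses group/world-writable files
ls "$tmpdir"
```

The chmod matters on the real ~/.ssh/authorized_keys too: sshd silently ignores the file if its permissions are too open.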
3. Hadoop Installation
3.1 Copy the Hadoop package to /usr/local. If it is in the hadoop home directory, run: cp /home/hadoop/hadoop-1.2.1.tar.gz /usr/local/
3.2 Enter the directory and extract Hadoop:
cd /usr/local
tar -zxvf hadoop-1.2.1.tar.gz
3.3 Configure environment variables
Run vim /etc/profile and append:
# set Hadoop environment
export HADOOP_HOME=/usr/local/hadoop-1.2.1
export PATH=$PATH:$HADOOP_HOME/bin
Exit and run source /etc/profile to make the configuration take effect.
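As with the Java settings, the Hadoop variables can be sanity-checked in the current shell before committing them to /etc/profile (a sketch; the path matches the tarball extracted above):

```shell
# Set the Hadoop variables in this shell only; once they look right, the same
# two lines go into /etc/profile. After source /etc/profile, the "hadoop"
# command should resolve from this PATH.
export HADOOP_HOME=/usr/local/hadoop-1.2.1
export PATH=$PATH:$HADOOP_HOME/bin
echo "HADOOP_HOME=$HADOOP_HOME"
```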
3.4 Edit the Hadoop configuration files
Run cd /usr/local/hadoop-1.2.1/conf to enter the conf directory.
3.4.1 Configure the hadoop-env.sh file
Open the file with vim hadoop-env.sh and add:
# set Java environment
export JAVA_HOME=/usr/local/program/jdk1.6.0_24
Save and exit after editing.
The configuration of the following three files is very important!
3.4.2 Configure the core-site.xml file
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000/</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/usr/local/hadoop-1.2.1/hadooptmp</value>
  </property>
</configuration>
Note: the trailing "/" after 9000 must not be omitted.
3.4.3 Configure the hdfs-site.xml file
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.permissions</name>
    <value>false</value>
  </property>
</configuration>
3.4.4 Configure the mapred-site.xml file
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
Be sure to configure all three files above exactly; do not introduce typos.
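A malformed site file will make the daemons fail at startup, so it is worth checking that each file is well-formed XML before continuing. The sketch below demonstrates the check on a throwaway copy in /tmp (assumes python3 is available); run the same parse against the real files in /usr/local/hadoop-1.2.1/conf:

```shell
# Write a minimal sample core-site.xml, then verify it parses as XML.
cat > /tmp/core-site.xml <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000/</value>
  </property>
</configuration>
EOF
for f in /tmp/core-site.xml; do
    python3 -c "import sys, xml.dom.minidom as m; m.parse(sys.argv[1])" "$f" \
        && echo "$f: well-formed" || echo "$f: MALFORMED"
done
```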
3.4.5 Configure the masters file and the slaves file
Run vim masters and set its contents to:
localhost
Run vim slaves and set its contents to:
localhost
Note: in pseudo-distributed mode, the NameNode acting as master and the DataNode acting as slave run on the same machine, so the address in both configuration files is the same.
3.4.6 Host name and IP resolution settings (this step is very important)
Run vim /etc/hosts and map the machine's host name to its IP address.
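For reference, a minimal /etc/hosts for this setup might look like the following; the 192.168.1.100 address and the "hadoop" host name are purely illustrative, so substitute your machine's actual host name and IP:

```
127.0.0.1       localhost localhost.localdomain
192.168.1.100   hadoop          # illustrative IP and host name
```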
3.4.7 Edit the host name
Run vim /etc/hostname
and vim /etc/sysconfig/network (on CentOS 6 it is the HOSTNAME= line in this file that takes effect).
4. Start Hadoop
Run cd /usr/local/hadoop-1.2.1/bin to enter the bin directory.
First format the NameNode with the command hadoop namenode -format.
Then start the daemons with start-all.sh.
Finally, run jps to verify; in pseudo-distributed mode you should see NameNode, DataNode, SecondaryNameNode, JobTracker, and TaskTracker (plus Jps itself).
5. Install Eclipse
5.1 Copy Eclipse to the /opt folder
Run cp /home/hadoop/eclipse-jee-indigo-SR2-linux-gtk.tar.gz /opt
5.2 Extract Eclipse
cd /opt
tar -zxvf eclipse-jee-indigo-SR2-linux-gtk.tar.gz
5.3 Install the plug-in hadoop-eclipse-plugin-1.2.1.jar
Run cp /home/hadoop/hadoop-eclipse-plugin-1.2.1.jar /opt/eclipse/plugins
5.4 Start Eclipse
Configure Eclipse (point the Hadoop plug-in at the Hadoop installation directory)
Create a DFS Location
Check whether the configuration is correct
Create a project
Run the code