Scenario: the host machine is a Mac Pro with two virtual machines installed, both running Ubuntu.
Configuring the JDK on Ubuntu
1. Download the JDK from Oracle's official website:
http://www.oracle.com/technetwork/java/javase/downloads/jdk7-downloads-1880260.html
2. Extract the downloaded archive:
lixiaojiao@ubuntu:~/software$ tar -zxvf jdk-7u79-linux-x64.tar.gz
3. Configure the Java environment variables
Open ~/.bashrc, jump to the end of the file, and add:
export JAVA_HOME=/home/lixiaojiao/software/jdk1.7.0_79
export CLASSPATH=.:$CLASSPATH:$JAVA_HOME/lib
export PATH=.:$PATH:$JAVA_HOME/bin
Save and exit.
4. The configuration does not take effect immediately; apply it with the following command:
lixiaojiao@ubuntu:~/software$ source ~/.bashrc
5. Verify that the configuration succeeded:
lixiaojiao@ubuntu:~/software$ java -version
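If the configuration took effect, the output should report the 7u79 build, roughly like this (a sketch; the exact build strings may differ):
java version "1.7.0_79"
Java(TM) SE Runtime Environment (build 1.7.0_79-b15)
Java HotSpot(TM) 64-Bit Server VM (build 24.79-b02, mixed mode)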
Configuring the Hadoop 2.6.1 environment on the Mac (the Java environment was configured previously)
1. Extract the downloaded Hadoop archive.
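The original post does not show the command; assuming the standard tarball name, it would look something like:
lixiaojiaodeMacBook-Pro:cloudcomputing lixiaojiao$ tar -zxvf hadoop-2.6.1.tar.gz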
2. I put the cloud-related software into the directory cloudcomputing. View the directory structure to confirm.
3. Set up passwordless SSH login
lixiaojiaodeMacBook-Pro:sbin lixiaojiao$ ssh-keygen -t rsa -P ""
Execute the following command
lixiaojiaodeMacBook-Pro:sbin lixiaojiao$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
Verify success:
lixiaojiaodeMacBook-Pro:sbin lixiaojiao$ ssh localhost
At first this failed. The reason is that remote login over SSH is not enabled on macOS by default; turn it on under System Preferences > Sharing > Remote Login. Then execute the command again:
lixiaojiaodeMacBook-Pro:sbin lixiaojiao$ ssh localhost
If the login succeeds, the setup is working.
4. Switch to the etc directory to view the configuration files
lixiaojiaodeMacBook-Pro:cloudcomputing lixiaojiao$ cd hadoop-2.6.1/etc/hadoop/
5. Modify the configuration files
Switch to the /Users/lixiaojiao/software/cloudcomputing/hadoop-2.6.1/etc/hadoop directory.
(1) Configure core-site.xml
Add the following between the <configuration> and </configuration> tags:
<property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
</property>
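For reference, a minimal sketch of the complete core-site.xml after the edit (the comment header that ships with the file is omitted):
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
    <property>
        <name>fs.default.name</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>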
(2) Configure yarn-site.xml
Add the following configuration:
<property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
</property>
<property>
    <name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
    <value>org.apache.hadoop.mapred.ShuffleHandler</value>
</property>
(3) Create and configure mapred-site.xml: copy the mapred-site.xml.template in the same directory to mapred-site.xml, then add the following configuration:
<property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
</property>
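The copy step itself is not shown in the original; from the etc/hadoop directory it would be something like:
lixiaojiaodeMacBook-Pro:hadoop lixiaojiao$ cp mapred-site.xml.template mapred-site.xml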
(4) Configure hdfs-site.xml. First create the new directories hdfs/name and hdfs/data under /Users/lixiaojiao/software/cloudcomputing/hadoop-2.6.1/, then add the following configuration:
<property>
    <name>dfs.replication</name>
    <value>1</value>
</property>
<property>
    <name>dfs.namenode.name.dir</name>
    <value>file:/Users/lixiaojiao/software/cloudcomputing/hadoop-2.6.1/hdfs/name</value>
</property>
<property>
    <name>dfs.datanode.data.dir</name>
    <value>file:/Users/lixiaojiao/software/cloudcomputing/hadoop-2.6.1/hdfs/data</value>
</property>
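The two directories can be created in one go; a sketch, run from the hadoop-2.6.1 root (the prompt label is assumed):
lixiaojiaodeMacBook-Pro:hadoop-2.6.1 lixiaojiao$ mkdir -p hdfs/name hdfs/data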
(5) Format HDFS
lixiaojiaodeMacBook-Pro:bin lixiaojiao$ ./hdfs namenode -format
If formatting succeeds, the log should end with a message saying the storage directory has been successfully formatted.
(6) Start Hadoop
Switch to the sbin directory:
lixiaojiaodeMacBook-Pro:bin lixiaojiao$ cd ../sbin/
Execute the start scripts (shown as screenshots in the original post).
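Judging from the deprecation notice quoted later, the author ran ./start-all.sh, for which Hadoop 2.x recommends these two equivalent scripts instead:
lixiaojiaodeMacBook-Pro:sbin lixiaojiao$ ./start-dfs.sh
lixiaojiaodeMacBook-Pro:sbin lixiaojiao$ ./start-yarn.sh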
Open http://localhost:50070/ in a browser and you will see the HDFS administration page.
Open http://localhost:8088/ and you will see the Hadoop application management page.
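As an extra check not in the original, the JDK's jps tool should list the five Hadoop daemons once startup succeeds (process IDs will differ):
lixiaojiaodeMacBook-Pro:sbin lixiaojiao$ jps
1234 NameNode
1330 DataNode
1443 SecondaryNameNode
1550 ResourceManager
1642 NodeManager
1705 Jps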
When running step (6) above, the following prompt appeared:
computing/hadoop-2.6.1/logs/hadoop-lixiaojiao-secondarynamenode-lixiaojiaodeMacBook-Pro.local.out
2015-10-18 10:08:43.887 java[1871:37357] Unable to load realm info from SCDynamicStore
Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
The cause: the .so native libraries under the official build's lib directory are compiled on a 32-bit system, but my Mac is a 64-bit system, so the native code has to be recompiled from source on a 64-bit machine. I tried building the source myself for a long time without success and finally gave up; instead I downloaded a 64-bit build compiled by an expert (http://yun.baidu.com/s/1c0rfioo#dir/path=%252fbuilder; the ordinary 32-bit Hadoop package is at http://www.aboutyun.com/thread-6658-1-1.html). After the download succeeded, I overwrote the native files in the lib directory with the native files from the downloaded 64-bit build, and then repeated the configuration steps above.
The following issue occurred when re-executing the command above:
lixiaojiaodeMacBook-Pro:sbin lixiaojiao$ ./start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
2015-10-19 21:18:29.414 java[5782:72819] Unable to load realm info from SCDynamicStore
I tried the fixes suggested by experts online, but none worked; in the end the only thing that helped was replacing the JDK. After reinstalling the JDK, I updated JAVA_HOME:
#export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.7.0_25.jdk/Contents/Home
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/
Verify success
Create the input directory:
lixiaojiaodeMacBook-Pro:hadoop-2.2.0 lixiaojiao$ hadoop fs -mkdir -p input
Upload a local file to the HDFS file system:
lixiaojiaodeMacBook-Pro:cloudcomputing lixiaojiao$ hadoop fs -copyFromLocal README.txt input
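To confirm the upload (a quick check not in the original), list the directory; README.txt should appear in the output:
lixiaojiaodeMacBook-Pro:cloudcomputing lixiaojiao$ hadoop fs -ls input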
A new issue appeared (a connection error). The fix was to change the host in fs.default.name to the IP address 127.0.0.1.
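In other words, the core-site.xml property from step (1) becomes (assuming the port stays at 9000):
<property>
    <name>fs.default.name</name>
    <value>hdfs://127.0.0.1:9000</value>
</property>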
Switch to the share/hadoop/mapreduce directory and execute the following statement:
hadoop jar hadoop-mapreduce-examples-2.2.0.jar wordcount input output
Execute the following command to see whether the output directory was generated:
hadoop fs -ls
Execute the following command to see the results:
hadoop fs -cat
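The cat command needs a path argument that the original omits; for the wordcount example the result is conventionally written to output/part-r-00000, so the check would look like:
hadoop fs -cat output/part-r-00000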
Therefore, to make it easy to deploy the configured Hadoop directly to the other Ubuntu systems, I used the scp command; the Ubuntu systems need SSH enabled:
lixiaojiaodeMacBook-Pro:cloudcomputing lixiaojiao$ scp -r hadoop-2.2.0 lixiaojiao@<vm1-ip>:/home/lixiaojiao/software
lixiaojiaodeMacBook-Pro:cloudcomputing lixiaojiao$ scp -r hadoop-2.2.0 lixiaojiao@<vm2-ip>:/home/lixiaojiao/software
(Replace <vm1-ip> and <vm2-ip> with each VM's address.)
Then verify success by executing the WordCount program.
Hadoop 2.2 Environment Configuration