The Linux version used here is CentOS 6.4 (centos-6.4-i386-bin-dvd1.iso, from http://mirrors.aliyun.com/centos/6.8/isos/i386/0). Networking uses host-only mode, so the virtual network card on Windows must be put on the same network segment as the network card in Linux. Note: make sure the Windows VMnet1 adapter's IP is in the same network segment as your virtual machine, but that the IPs are not the same.

Preparation:

1. Modify the Linux IP. This can be done by hand in the GUI, or by editing the config file:
vim /etc/sysconfig/network-scripts/ifcfg-eth0

2. Change the hostname (note: Ubuntu versions do this differently):
vim /etc/sysconfig/network
Change the existing name to itcast01.

3. Map the hostname to the IP:
vim /etc/hosts
192.168.8.88 itcast01

4. Turn off the firewall:
View the firewall status: service iptables status
Stop the firewall: service iptables stop
View the firewall's boot status: chkconfig iptables --list
Disable it at boot: chkconfig iptables off

II. Install the Java JDK

The JDK used here is jdk-7u60-linux-i586.tar.gz. I use a VMware shared folder (the VMware Tools package must be installed), so that files under Windows can be shared to the Linux platform; the share appears under /mnt/hdfs/.
mkdir /usr/java
tar -zxvf jdk-7u60-linux-i586.tar.gz -C /usr/java
Add Java to the environment variables:
vim /etc/profile
Append the following at the end of the file:
export JAVA_HOME=/usr/java/jdk1.7.0_60
export PATH=$PATH:$JAVA_HOME/bin
Refresh the configuration:
source /etc/profile

III. Install Hadoop

Download Hadoop from https://archive.apache.org/dist/hadoop/core/hadoop-2.2.0/ ; the file downloaded here is hadoop-2.2.0.tar.gz.

1. Upload the Hadoop package; I use FileZilla to upload it to the root directory under Linux.

2. Extract the Hadoop package. First create an /itcast directory at the root:
mkdir /itcast
tar -zxvf hadoop-2.2.0.tar.gz -C /itcast

3.
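The preparation steps above can be sketched as a small script. This is a hedged illustration, not the exact procedure: it writes the config fragments to /tmp instead of the real /etc paths so it can run without root, and the netmask value 255.255.255.0 is an assumption (the IP and hostname come from the text).

```shell
#!/bin/sh
# Sketch of the CentOS 6 prep steps: static IP, hostname, and hosts mapping.
# Written to /tmp rather than /etc so it runs without root privileges.

# 1. Static IP config as it would go into /etc/sysconfig/network-scripts/ifcfg-eth0
#    (NETMASK is an assumed value for a typical /24 host-only segment)
cat > /tmp/ifcfg-eth0 <<'EOF'
DEVICE=eth0
ONBOOT=yes
BOOTPROTO=static
IPADDR=192.168.8.88
NETMASK=255.255.255.0
EOF

# 2. Hostname as it would go into /etc/sysconfig/network
cat > /tmp/network <<'EOF'
NETWORKING=yes
HOSTNAME=itcast01
EOF

# 3. Hostname-to-IP mapping as it would be appended to /etc/hosts
echo "192.168.8.88 itcast01" > /tmp/hosts.fragment

# Quick sanity check that the fragments were written
grep -q 'IPADDR=192.168.8.88' /tmp/ifcfg-eth0 && echo "prep fragments written"
```

On the real machine these fragments belong in /etc/sysconfig/network-scripts/ifcfg-eth0, /etc/sysconfig/network, and /etc/hosts, followed by `service network restart` to apply the IP change.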
Configure Hadoop for pseudo-distributed mode (5 files under etc/hadoop need to be modified).

First: hadoop-env.sh
vim hadoop-env.sh
export JAVA_HOME=/usr/java/jdk1.7.0_60

Second: core-site.xml
<configuration>
    <!-- Address of the HDFS master (NameNode) -->
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://itcast01:9000</value>
    </property>
    <!-- Directory where Hadoop stores the files it generates at runtime -->
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/itcast/hadoop-2.2.0/tmp</value>
    </property>
</configuration>

Third: hdfs-site.xml
<configuration>
    <!-- Number of copies of each block that HDFS keeps -->
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>

Fourth: mapred-site.xml (this file must first be copied from mapred-site.xml.template)
<configuration>
    <!-- Tell Hadoop that MapReduce runs on YARN -->
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
</configuration>

Fifth: yarn-site.xml
<configuration>
    <!-- The way NodeManager fetches data is shuffle -->
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
    <!-- Address of YARN's master (ResourceManager) -->
    <property>
        <name>yarn.resourcemanager.hostname</name>
        <value>itcast01</value>
    </property>
</configuration>

4. Add Hadoop to the environment variables:
vim /etc/profile
export JAVA_HOME=/usr/java/jdk1.7.0_60
export HADOOP_HOME=/itcast/hadoop-2.2.0
export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin
Refresh the configuration:
source /etc/profile

5. Initialize HDFS (format the file system; this step is similar to how a newly bought USB flash drive needs to be formatted):
hadoop namenode -format   (outdated)
hdfs namenode -format

6. Start HDFS and YARN:
./start-all.sh (obsolete) This script is Deprecated.
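The config edits above can be sketched as a script that generates the XML files. This is an illustrative sketch: /tmp/hadoop-conf stands in for the real /itcast/hadoop-2.2.0/etc/hadoop directory so it runs without the actual install, and yarn-site.xml is omitted for brevity (it follows the same pattern).

```shell
#!/bin/sh
# Sketch: generate the pseudo-distributed config files from the values above.
# /tmp/hadoop-conf stands in for the real /itcast/hadoop-2.2.0/etc/hadoop.
HCONF=/tmp/hadoop-conf
mkdir -p "$HCONF"

# NameNode address and runtime file directory
cat > "$HCONF/core-site.xml" <<'EOF'
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://itcast01:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/itcast/hadoop-2.2.0/tmp</value>
  </property>
</configuration>
EOF

# Single replica, since this is a one-node pseudo-distributed setup
cat > "$HCONF/hdfs-site.xml" <<'EOF'
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
EOF

# On a real install, copy the shipped template first:
#   cp mapred-site.xml.template mapred-site.xml
cat > "$HCONF/mapred-site.xml" <<'EOF'
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>
EOF

grep -q 'hdfs://itcast01:9000' "$HCONF/core-site.xml" && echo "configs written"
```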
Instead use start-dfs.sh and start-yarn.sh. While starting namenodes on [it] there is a small issue: the password is required multiple times.

Next, use jps to view the processes (jps lists the current Java processes on the Linux/Unix platform). If the following processes appear, the setup passes:
4334 NodeManager
3720 NameNode
4060 ResourceManager
3806 DataNode
4414 Jps

In addition, we can check from a browser on the Windows platform whether the build succeeded:
http://192.168.8.88:50070 (HDFS management interface)
http://192.168.8.88:8088 (YARN management interface)
For the hostname to resolve on Windows, add the Linux hostname-to-IP mapping at the end of the file C:\Windows\System32\drivers\etc\hosts:
192.168.8.88 itcast01
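The jps check above can be automated by scanning the output for the four expected daemon names. This sketch uses the sample process list from the text, written to /tmp/jps.out; on a live machine you would capture real output with `jps > /tmp/jps.out` instead.

```shell
#!/bin/sh
# Sketch: verify the expected Hadoop daemons appear in `jps` output.
# Sample output taken from the text; on the real box use: jps > /tmp/jps.out
cat > /tmp/jps.out <<'EOF'
4334 NodeManager
3720 NameNode
4060 ResourceManager
3806 DataNode
4414 Jps
EOF

# Each jps line is "<pid> <ClassName>"; match the name at end of line.
for proc in NameNode DataNode ResourceManager NodeManager; do
  if grep -q " $proc\$" /tmp/jps.out; then
    echo "$proc: running"
  else
    echo "$proc: MISSING"
  fi
done
```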
Hadoop download, installation, and configuration on the Linux platform