Had some idle time over the holiday, so I set up an environment on my own, covering two tools (ssh, vim) and the runtime environment (JDK, Hadoop). Recording the steps here for future use.
1. Tools
The two commonly used tools, ssh and vim, take one command each to install:
vim: sudo apt-get install vim-gtk
ssh: sudo apt-get install openssh-server
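Hadoop's start/stop scripts log in to localhost over SSH, so right after installing openssh-server it is worth setting up passwordless login. A minimal sketch, assuming the default ~/.ssh paths and an empty passphrase:
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa        # generate a key pair with no passphrase
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys  # authorize the key for localhost login
chmod 600 ~/.ssh/authorized_keys
ssh localhost                                    # should now log in without a password prompt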
2. Environment
2.1 JDK Installation
The JDK only needs to be decompressed (.tar.gz) to be usable, but you also need to add the following lines to the /etc/profile file:
export JAVA_HOME=/opt/java/jdk
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export HADOOP_HOME=/opt/java/hadoop
export PATH=.:$HADOOP_HOME/bin:$JAVA_HOME/bin:$PATH
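After saving /etc/profile, reload it and check that the JDK is picked up (the paths above assume the JDK archive was unpacked to /opt/java/jdk):
source /etc/profile
echo $JAVA_HOME     # should print /opt/java/jdk
java -version       # should print the version of the unpacked JDK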
2.2 Hadoop Installation
Hadoop can likewise be used right after decompressing the archive (.tar.gz).
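A minimal sketch of the unpacking step (the archive name hadoop-2.x.y.tar.gz is just a placeholder for whatever version you downloaded; the target path matches the HADOOP_HOME exported above):
tar -xzf hadoop-2.x.y.tar.gz          # extract the downloaded archive
mv hadoop-2.x.y /opt/java/hadoop      # move it to the HADOOP_HOME location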
Then switch to the ${HADOOP_HOME}/etc/hadoop/ folder to edit the configuration files below.
1. Modify the core-site.xml file
<configuration>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>file:/opt/java/hadoop/tmp</value>
        <description>A base for other temporary directories.</description>
    </property>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>
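Since hadoop.tmp.dir points at /opt/java/hadoop/tmp, it does not hurt to create that directory up front (a small optional step, assuming the paths used above):
mkdir -p /opt/java/hadoop/tmp    # base directory referenced by hadoop.tmp.dir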
2. Modify the hdfs-site.xml file
<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
    <property>
        <name>dfs.namenode.name.dir</name>
        <value>file:/opt/java/hadoop/tmp/dfs/name</value>
    </property>
    <property>
        <name>dfs.datanode.data.dir</name>
        <value>file:/opt/java/hadoop/tmp/dfs/data</value>
    </property>
</configuration>
3. Format HDFS (run from ${HADOOP_HOME}):
./bin/hdfs namenode -format
4. Start and stop the services:
./sbin/start-dfs.sh
./sbin/stop-dfs.sh
5. Run jps to check whether the services have started.
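For a pseudo-distributed setup started with start-dfs.sh, jps should list roughly the following daemons (the PIDs here are just placeholders):
jps
# 12345 NameNode
# 12456 DataNode
# 12789 SecondaryNameNode
# 13000 Jps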
Environment setup: JDK, SSH, Vim, Hadoop, SybaseIQ