Day two of working with Hadoop. Setting up the Hadoop environment took a full two days, so I'm writing down my configuration process here in the hope that it helps!
All the resources used in this article are shared here, click to download, no need to hunt for them yourself!
Among them is a book on Hadoop internals; its first chapter describes the configuration process, but not in much detail.~
---------------Installing the JDK-------------------------------
1. Download jdk1.6.0_45
2. Unzip it to the /opt directory and configure /etc/profile, adding the following at the end of the file:
#set Java Environment
export JAVA_HOME=/opt/jdk1.6.0_45
export JRE_HOME=/opt/jdk1.6.0_45/jre
export CLASSPATH=$JAVA_HOME/lib:$JRE_HOME/lib:$CLASSPATH
export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$PATH
3. Then run source /etc/profile to re-execute the modified initialization file (profile).
4. Configure the default programs:
update-alternatives --install /usr/bin/java java /opt/jdk1.6.0_45/bin/java 300
update-alternatives --install /usr/bin/javac javac /opt/jdk1.6.0_45/bin/javac 300
update-alternatives --install /usr/bin/jar jar /opt/jdk1.6.0_45/bin/jar 300
update-alternatives --install /usr/bin/javah javah /opt/jdk1.6.0_45/bin/javah 300
update-alternatives --install /usr/bin/javap javap /opt/jdk1.6.0_45/bin/javap 300
Then run the following command and select the version of the JDK just installed:
update-alternatives --config java
5. You can then run java -version to check the Java version.
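If everything worked, the output should look roughly like the following (a sketch; your exact build numbers may differ):
java version "1.6.0_45"
Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)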
---------------Install Eclipse-------------------------------
1. Download the Java edition of Eclipse from the official website (mirror link):
http://mirror.neu.edu.cn/eclipse/technology/epp/downloads/release/kepler/SR2/eclipse-java-kepler-SR2-linux-gtk.tar.gz
2. Unzip it to the /home/simon directory.
3. Use vi to create a shell script named eclipse:
vi /usr/local/bin/eclipse
The contents are as follows:
/home/simon/eclipse/eclipse
4. Give the eclipse script execute permission: chmod +x /usr/local/bin/eclipse
5. Now you can launch Eclipse simply by typing eclipse.
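Steps 3 and 4 can also be done in one go from a root shell; a minimal sketch, assuming the same /home/simon install location as above:
cat > /usr/local/bin/eclipse <<'EOF'
#!/bin/sh
/home/simon/eclipse/eclipse
EOF
chmod +x /usr/local/bin/eclipse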
---------------Install ant-------------------------------
1. Download Ant:
http://mirror.esocc.com/apache//ant/binaries/apache-ant-1.9.4-bin.tar.gz
2. Unzip it to the /home/simon directory (see the sketch after this list).
3. Edit the /etc/profile file:
export ANT_HOME=/home/simon/apache-ant-1.9.4
export PATH=$PATH:$ANT_HOME/bin
4. Then run source /etc/profile to apply the change.
5. Run ant -version to verify the installation:
Apache Ant(TM) version 1.9.4 compiled on April 29 2014
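For step 2, a minimal sketch of the download-and-unpack commands, assuming the tarball is fetched into the current directory first:
wget http://mirror.esocc.com/apache//ant/binaries/apache-ant-1.9.4-bin.tar.gz
tar -xzf apache-ant-1.9.4-bin.tar.gz -C /home/simon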
---------------Installing Hadoop-------------------------------
1. Change the machine name: edit /etc/hostname and change it to localhost.
2. Configure passwordless SSH login:
ssh-keygen -t rsa
cd ~/.ssh
cat id_rsa.pub >> authorized_keys
apt-get install openssh-server
3. If the command ssh localhost fails, you need to start the SSH service with one of the following commands:
service ssh start
/etc/init.d/ssh start
If startup fails, reboot the machine; it should work afterwards. A quick verification sketch follows below.
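To confirm the passwordless setup before moving on, ssh localhost should now log you in without prompting for a password:
ssh localhost
exit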
4. Configure Hadoop
(1) Edit conf/hadoop-env.sh and change the value of JAVA_HOME:
export JAVA_HOME=/opt/jdk1.6.0_45
(2) Edit conf/mapred-site.xml and add:
<property>
<name>mapred.job.tracker</name>
<value>http://localhost:9001</value>
</property>
(3) Edit conf/hdfs-site.xml and add:
<property>
<name>dfs.name.dir</name>
<value>/home/simon/name</value>
</property>
<property>
<name>dfs.data.dir</name>
<value>/home/simon/data</value>
</property>
<property>
<name>dfs.permissions</name>
<value>false</value>
</property>
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
(4) Edit conf/core-site.xml and add (a complete-file sketch follows after this step):
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:9000</value>
</property>
<property>
<name>hadoop.tmp.dir</name>
<value>/home/hadoop/hadoop-1.0.0/tmp</value>
</property>
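Note that in all three XML files the <property> elements must sit inside the file's root <configuration> element; the stock conf files in the Hadoop release already contain an empty <configuration></configuration> pair. For instance, the complete core-site.xml from step (4) would look like:
<?xml version="1.0"?>
<configuration>
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:9000</value>
</property>
<property>
<name>hadoop.tmp.dir</name>
<value>/home/hadoop/hadoop-1.0.0/tmp</value>
</property>
</configuration>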
(5) Format HDFS: bin/hadoop namenode -format
Start Hadoop: bin/start-all.sh
If a permissions error appears, either the files lack execute permission or their owner is not the current user (root). You can try:
chmod +x <file name>
chown root:root bin/*
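To check that all the daemons actually came up after start-all.sh, the JDK's jps tool lists the running Java processes. On a healthy single-node Hadoop 1.x setup the output should look roughly like this (a sketch; your process IDs will differ):
jps
12001 NameNode
12102 DataNode
12203 SecondaryNameNode
12304 JobTracker
12405 TaskTracker
12506 Jps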
-------------------Configuring the Eclipse plug-in---------------
1. Copy hadoop-eclipse-plugin-1.0.0.jar into the plugins folder under the Eclipse directory.
2. Open Eclipse. In the Window -> Show View -> Other... dialog box, select MapReduce Tools -> Map/Reduce Locations.
If that entry is missing from the dialog box, open the %ECLIPSE_DIR%/configuration/config.ini file, find the org.eclipse.update.reconcile=false setting, change it to true, and then start Eclipse again.
3. You should see DFS Locations in the Project Explorer; if you can expand it a few directories down, the configuration succeeded.
Start Eclipse with:
env UBUNTU_MENUPROXY= /home/simon/eclipse/eclipse
Note that there is a space between the equals sign and the eclipse path: UBUNTU_MENUPROXY is deliberately set to empty, which works around the Unity global-menu issue that breaks Eclipse's menus on Ubuntu.
------------------Executing Java programs--------------------
1. Configure the input/output paths.
Right-click the program -> Run As -> Run Configurations... -> Arguments, and fill in:
hdfs://localhost:9000/test/input hdfs://localhost:9000/test/output
Separate the input path and the output path with a space.
2. Import the Hadoop jar packages: right-click the project -> Properties -> select Java Build Path on the left -> the Libraries tab on the right -> Add External JARs..., then select the jar packages under the hadoop/lib/ path. If you don't know which ones to choose, just choose them all! ~ (helpless)
3. In the program, right-click -> Run As -> Run on Hadoop to execute it (make sure the input data exists in HDFS first; see the sketch below).
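Before running, the input path must already exist in HDFS and contain data, and the output path must not exist yet (Hadoop refuses to overwrite it). A minimal sketch using the paths from step 1, where sample.txt is a hypothetical local file:
bin/hadoop fs -mkdir /test/input
bin/hadoop fs -put sample.txt /test/input
bin/hadoop fs -rmr /test/output
The last line just clears a stale output directory left over from an earlier run.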
Copyright notice: this is an original post by the blog author; it may not be reproduced without consent.