Using Eclipse to Develop MapReduce in Linux

Reference: http://www.powerxing.com/hadoop-build-project-using-eclipse/


Preparation. Download the hadoop-eclipse-plugin from https://github.com/winghc/hadoop2x-eclipse-plugin. Eclipse can be installed directly from the Ubuntu Software Center; the default installation path is /usr/lib/eclipse. The JDK needs to be version 1.7: check with java -version, and if it is not 1.7, install it with the following command:

sudo apt-get install openjdk-7-jre openjdk-7-jdk

(by default it is installed under /usr/lib/jvm/java-7-openjdk-amd64). Then open ~/.bashrc (e.g. vim ~/.bashrc), add the line

export PATH=$PATH:/usr/lib/jvm/java-7-openjdk-amd64/bin

save, and run source ~/.bashrc. Copy the plugin into Eclipse's plugins directory, run ./eclipse -clean in the Eclipse root directory, and restart Eclipse; DFS Locations should now be visible. Start Hadoop with $HADOOP_HOME/sbin/start-dfs.sh, then establish the connection as follows.
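As a quick sanity check before touching Eclipse (a minimal sketch; the daemon list assumes the pseudo-distributed HDFS setup from the referenced tutorial), you can verify the JDK and the HDFS daemons from a terminal:

# Confirm the JDK is 1.7
java -version
# Start HDFS and list the running daemons
/usr/local/hadoop/sbin/start-dfs.sh
jps
# jps should list NameNode, DataNode and SecondaryNameNode alongside Jps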

Step one: select Preferences under the Window menu.

Open Preferences

A dialog will pop up, and a new Hadoop Map/Reduce option will appear on its left side. Click this option and select the installation directory of Hadoop (such as /usr/local/hadoop; the directory picker in Ubuntu works poorly, so typing the path in directly is fine).

Select the installation directory for Hadoop

Step two: switch to the Map/Reduce perspective. Choose Open Perspective -> Other under the Window menu; in the dialog that pops up, select the Map/Reduce option to switch.

Switch to the Map/Reduce perspective

Step three: establish a connection to the Hadoop cluster. Click the Map/Reduce Locations panel in the lower right corner of the Eclipse window, right-click inside the panel, and select New Hadoop Location.

Establish a connection to the Hadoop cluster

In the General tab of the dialog that pops up, fill in the Master settings so that they are consistent with the Hadoop configuration. For example, in the pseudo-distributed configuration I use, fs.defaultFS is set to hdfs://localhost:9000, so the Port of DFS Master should likewise be changed to 9000.
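If you are unsure which port your installation uses, you can read fs.defaultFS straight out of core-site.xml (the hdfs://localhost:9000 value shown below is the one from the referenced tutorial's pseudo-distributed setup, not a given):

# Show the NameNode address Hadoop is actually configured with
grep -A 1 'fs.defaultFS' /usr/local/hadoop/etc/hadoop/core-site.xml
# expected output (pseudo-distributed): <value>hdfs://localhost:9000</value>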

Location Name can be filled in freely; for Map/Reduce Master Host, fill in your local IP (localhost also works); the Port defaults to 50020. The final settings are as follows:

Settings for Hadoop Location

Then switch to the Advanced Parameters tab, which contains the detailed configuration. Remember that it needs to be consistent with the configuration of Hadoop (the configuration files under /usr/local/hadoop/etc/hadoop); for example, since I configured hadoop.tmp.dir, it needs to be changed here as well.

Settings for Hadoop Location
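The value to enter for hadoop.tmp.dir can be read from the same configuration file (the file:/usr/local/hadoop/tmp value below is an assumption taken from the referenced tutorial's setup; use whatever your core-site.xml actually says):

# Show the configured Hadoop temporary directory
grep -A 1 'hadoop.tmp.dir' /usr/local/hadoop/etc/hadoop/core-site.xml
# expected output: <value>file:/usr/local/hadoop/tmp</value>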

Finally, click Finish; the Map/Reduce Location is created.
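To confirm the new location actually talks to HDFS (a minimal sketch; the /user/hadoop path assumes the hadoop user from the referenced tutorial), upload a test file and refresh DFS Locations:

# Create a directory in HDFS and upload a file into it
/usr/local/hadoop/bin/hdfs dfs -mkdir -p /user/hadoop/input
/usr/local/hadoop/bin/hdfs dfs -put /usr/local/hadoop/etc/hadoop/core-site.xml /user/hadoop/input
# The file should appear under DFS Locations in Eclipse after a refresh
/usr/local/hadoop/bin/hdfs dfs -ls /user/hadoop/input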

So the configuration is complete.

Problems with running the MapReduce project in Eclipse

Problem: after switching to JDK 1.8, MapReduce programs no longer run, with an error along the lines of "cannot find Map$Entry". Fix: download Eclipse Juno again from the official website; after unpacking, delete the original Eclipse and move the new one into its place. Remember to copy the hadoop-eclipse-plugin into Eclipse's plugins directory.

Problem: no DFS Locations, and no Hadoop Map/Reduce entry in Preferences (error 51). Reason: the JDK used by Eclipse differs from the JDK the plugin was compiled with. Fix: unify on JDK 1.7, i.e. configure both the Linux JDK and the JDK in Eclipse as 1.7, and add the following environment variables to ~/.bashrc:

#set Java environment
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export JRE_HOME=/usr/lib/jvm/java-7-openjdk-amd64/jre
export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$PATH
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
#set Hadoop environment
export HADOOP_HOME=/usr/local/hadoop
export PATH=$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$PATH
#set Scala environment
export SCALA_HOME=/usr/local/scala-2.10.4
export PATH=$PATH:$SCALA_HOME/bin
#set Spark environment
export SPARK_HOME=/usr/local/spark-1.5.2
export PATH=$PATH:$SPARK_HOME/bin

After saving, update the environment variables with source ~/.bashrc and restart the virtual machine. If the problem is still there after the restart, open Eclipse from the terminal with /usr/lib/eclipse/eclipse -clean; if that does not help either, re-copy the plugin jar and re-import the project.
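To check that everything now agrees on JDK 1.7 (a quick sketch relying on the exports above):

# The java on the PATH and the one under JAVA_HOME should both report 1.7
java -version
$JAVA_HOME/bin/java -version
# Hadoop should pick up the same environment
hadoop version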
