[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:2.2.0:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: protoc version is 'libprotoc 2.4.1', expected version is '2.5.0' -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-common
Installing protoc (protobuf 2.5.0)
wget https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz
(or pick it from the download list at https://code.google.com/p/protobuf/downloads/list)
Unpack the archive, cd into the extracted directory, and run sudo ./configure --prefix=/usr
If the build fails with:
cpp: error trying to exec 'cc1plus': execvp: No such file or directory
install g++ first:
sudo apt-get install g++
sudo make
sudo make check
sudo make install
protoc --version
If protoc then reports 'protoc: error while loading shared libraries: libprotoc.so.8: cannot open shared object file: No such file or directory' (typical on Ubuntu, where the library lands in /usr/local/lib by default), you need to install under /usr instead: the --prefix=/usr flag on sudo ./configure --prefix=/usr is mandatory, so recompile and reinstall with it.
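For convenience, here is the whole protobuf 2.5.0 install collected into one sequence (a sketch based on the steps above; the googlecode URL may no longer resolve, in which case use a mirror of the same tarball):
wget https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz
tar xzf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
sudo ./configure --prefix=/usr   # --prefix=/usr avoids the libprotoc.so.8 error on Ubuntu
sudo make
sudo make check
sudo make install
protoc --version                 # should print: libprotoc 2.5.0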
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (make) on project hadoop-common: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "/home/wyf/hadoop-2.0.2-alpha-src/hadoop-common-project/hadoop-common/target/native"): java.io.IOException: error=2, No such file or directory -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
Installing cmake
sudo apt-get install cmake
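A quick sanity check that the build will now find it (the exact version depends on your Ubuntu release):
cmake --version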
[ERROR] Failed to execute goal org.codehaus.mojo.jspc:jspc-maven-plugin:2.0-alpha-3:compile (hdfs) on project hadoop-hdfs: Execution hdfs of goal org.codehaus.mojo.jspc:jspc-maven-plugin:2.0-alpha-3:compile failed: Plugin org.codehaus.mojo.jspc:jspc-maven-plugin:2.0-alpha-3 or one of its dependencies could not be resolved: Could not transfer artifact ant:ant:jar:1.6.5 from/to central (http://repo.maven.apache.org/maven2): GET request of: ant/ant/1.6.5/ant-1.6.5.jar from central failed: Read timed out -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginResolutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Installing Ant
1. Download Ant
Baidu network disk: apache-ant-1.9.4-bin.tar.gz (http://pan.baidu.com/s/1c0vjhBy)
or download apache-ant-1.9.4-bin.tar.gz from the Apache Ant download page.
2. Unpack
tar zxvf apache-ant-1.9.4-bin.tar.gz
3. Configure environment variables
vim ~/.bashrc
export ANT_HOME=/home/xxl/apache-ant-1.9.4
export PATH=$ANT_HOME/bin:$PATH
source ~/.bashrc
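A quick check that the new environment is picked up (open a new shell or source ~/.bashrc first; the path above is just the example install location):
ant -version   # should report Apache Ant(TM) version 1.9.4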
[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:2.4.0:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: 'protoc --version' did not return a version -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-common
The installed protobuf version is too low (Maven cannot get a usable version out of protoc); install protobuf 2.5.0 as described above.
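Before re-running Maven, it is worth confirming which protoc the build will see; this check is not part of the original steps, just a sanity test:
which protoc
protoc --version   # should print: libprotoc 2.5.0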
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (compile) on project hadoop-snappy: An Ant BuildException has occured: The following error occurred while executing this line:
[ERROR] /home/ngc/char/snap/hadoop-snappy/hadoop-snappy-read-only/maven/build-compilenative.xml:75: exec returned: 2
The cause here is annoying: hadoop-snappy is picky about the GCC version. My machine was running Ubuntu 12.04 (December 2012), which already ships GCC 4.6, but a comment on the Google Code project said that going back from gcc 4.6 to gcc 4.4 fixed it. I tried that, and sure enough the error disappeared.
gcc --version   # check the gcc version
gcc (Ubuntu/Linaro 4.4.7-1ubuntu2) 4.6.3
Copyright (C) Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
How do you go back?
1. sudo apt-get install gcc-4.4
2. sudo rm /usr/bin/gcc
3. sudo ln -s /usr/bin/gcc-4.4 /usr/bin/gcc
After that, gcc --version will show that gcc has become 4.4.7.
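The same symlink swap, collected into one place (a sketch; the restore line assumes gcc-4.6 is still installed and is only needed if you want the newer compiler back later):
sudo apt-get install gcc-4.4
sudo rm /usr/bin/gcc
sudo ln -s /usr/bin/gcc-4.4 /usr/bin/gcc
gcc --version                          # should now report 4.4.x
# to switch back later: sudo ln -sf /usr/bin/gcc-4.6 /usr/bin/gcc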
[exec] /bin/bash ./libtool --tag=CC --mode=link gcc -g -Wall -fPIC -O2 -m64 -g -O2 -version-info 0:1:0 -L/usr/local//lib -o libhadoopsnappy.la -rpath /usr/local/lib src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.lo src/org/apache/hadoop/io/compress/snappy/SnappyDecompressor.lo -ljvm -ldl
[exec] /usr/bin/ld: cannot find -ljvm
[exec] collect2: ld returned 1 exit status
[exec] make: *** [libhadoopsnappy.la] Error 1
[exec] libtool: link: gcc -shared -fPIC -DPIC src/org/apache/hadoop/io/compress/snappy/.libs/SnappyCompressor.o src/org/apache/hadoop/io/compress/snappy/.libs/SnappyDecompressor.o -L/usr/local//lib -ljvm -ldl -O2 -m64 -O2 -Wl,-soname -Wl,libhadoopsnappy.so.0 -o .libs/libhadoopsnappy.so.0.0.1
There are plenty of blog posts about '/usr/bin/ld: cannot find -lxxx', but none of them apply here: nothing is missing and nothing is the wrong version. The real cause is that there is no libjvm.so symlink in /usr/local/lib pointing at the JVM. On an amd64 system you can look under /root/bin/jdk1.6.0_37/jre/lib/amd64/server/ to see where libjvm.so lives, and then create the link:
ln -s /root/bin/jdk1.6.0_37/jre/lib/amd64/server/libjvm.so /usr/local/lib/
That solves the problem.
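To confirm the link is in place before re-running the build (the JDK path is just the example used above):
ls -l /usr/local/lib/libjvm.so   # should point at .../jre/lib/amd64/server/libjvm.so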
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (make) on project hadoop-common: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
Installing zlib-devel
On Ubuntu the package is zlib1g-dev:
sudo apt-get install zlib1g-dev
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-pipes: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ...<exec dir="/home/xxl/hadoop-2.5.2-src/hadoop-tools/hadoop-pipes/target/native" executable="cmake" failonerror="true"... @ 5:120 in /home/xxl/hadoop-2.5.2-src/hadoop-tools/hadoop-pipes/target/antrun/build-main.xml
[ERROR] -> [Help 1]
Installation: sudo apt-get install libssl-dev
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (tar) on project hadoop-dist: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ...<exec dir="/home/xxl/hadoop-2.5.2-src/hadoop-dist/target" executable="sh" failonerror="true"... @ 21:96 in /home/xxl/hadoop-2.5.2-src/hadoop-dist/target/antrun/build-main.xml
Installation: sudo apt-get install build-essential
sudo apt-get install libglib2.0-dev
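For reference, all of the Ubuntu packages mentioned in this post can be installed up front in a single command (package names exactly as used above):
sudo apt-get install g++ cmake zlib1g-dev libssl-dev build-essential libglib2.0-dev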
Those are the common errors encountered when compiling Hadoop as 64-bit on Linux.