Building Hadoop 2.2 from Source

Source: Internet
Author: User
Tags: install, openssl, php, mysql

Compiling hadoop-2.2.0 from source

When I first came into contact with Hadoop, I ran into a big problem, shown below:

WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

That warning did not seem important at the time; after all, harmless warnings also show up when compiling PHP and MySQL. But this one does matter: the official Hadoop release is compiled for 32-bit platforms only, so running it on a 64-bit platform can cause problems. Since the Apache website does not provide a 64-bit build, you need to download the Hadoop source code and compile it on a 64-bit platform yourself in order to run it on 64-bit.
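One quick way to confirm that a 32-bit/64-bit mismatch is really the cause is to compare the platform architecture against the shipped native library. The sketch below is illustrative only: the `arch_bits` helper is my own, and the commented library path is an assumption to adjust for your install.

```shell
# Map a machine architecture string (as printed by `uname -m`) to its bitness.
arch_bits() {
  case "$1" in
    x86_64|amd64|aarch64) echo 64 ;;
    i386|i486|i586|i686|arm*) echo 32 ;;
    *) echo unknown ;;
  esac
}

echo "Platform is $(arch_bits "$(uname -m)")-bit"

# Then inspect the native library shipped with the release (path assumed):
#   file $HADOOP_HOME/lib/native/libhadoop.so.1.0.0
# Seeing "ELF 32-bit" there on a 64-bit host explains the warning above.
```

If the two disagree, rebuilding from source on the target platform, as this post describes, is the fix.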


Rambling:

To sum up my experience compiling Hadoop over the past few days: when I first compiled it, I searched online for some compilation guides and felt it would be no different from building any other software. But toward the end of the process, one problem followed another; the build errors were all over the place, and I was confused. For three nights in a row I fought the errors until past two in the morning. Then I summarized what I had learned and got back on track step by step. Yesterday afternoon I went online for more information and compiled it three more times; fortunately no errors were reported during compilation, but the resulting tar package contained no lib directory, which made my head spin. Finally, Mr. Tian helped me analyze the problem, and the cause turned out to be an environment variable issue. Altogether I spent five days compiling Hadoop. What a hard-won lesson!


All the packages required for this post are bundled together; you can download them and compile yourself.

Address: http://yunpan.cn/Qa56dyuuKEZid (access password: b019)

I have also uploaded the Hadoop package already compiled on a 64-bit platform; you can download and use it directly for experiments.

Address: http://yunpan.cn/Qa5kMFGmNMY9H (access password: 6090)


Deployment environment:

System: CentOS 6.4 64-bit

Hadoop version: hadoop-2.2.0-src.tar.gz

JDK: jdk-7u60-linux-x64.tar.gz

1. Install the dependency packages:

yum -y install lzo-devel zlib-devel gcc autoconf automake libtool gcc-c++
yum -y install openssl-devel ncurses-devel

2. Compiling Hadoop also depends on the following tools: ant, maven, protocol buffers (protobuf), findbugs, and cmake.

3. Compile and install protobuf:

[[email protected] ~]# tar xf protobuf-2.5.0.tar.gz
[[email protected] ~]# cd protobuf-2.5.0
[[email protected] protobuf-2.5.0]# ./configure --prefix=/usr/local/protobuf
[[email protected] protobuf-2.5.0]# make && make install

4. Compile and install cmake:

[[email protected] ~]# tar xf cmake-2.8.12.tar.gz
[[email protected] ~]# cd cmake-2.8.12
[[email protected] cmake-2.8.12]# ./bootstrap
[[email protected] cmake-2.8.12]# make && make install

5. Install ant:

[[email protected] ~]# tar xf apache-ant-1.9.4-bin.tar.gz -C /usr/local/

6. Install maven:

[[email protected] ~]# tar xf apache-maven-3.0.5-bin.tar.gz -C /usr/local/

7. Install findbugs:

[[email protected] ~]# tar xf findbugs-2.0.2.tar.gz -C /usr/local/

8. Install the JDK:

[[email protected] ~]# tar xf jdk-7u60-linux-x64.tar.gz -C /usr/local/

9. Environment variables (in /etc/profile):

# java
export JAVA_HOME=/usr/local/jdk1.7.0_60
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=.:$CLASSPATH:$JAVA_HOME/lib:$JRE_HOME/lib
export PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin
# maven
export MAVEN_HOME=/usr/local/apache-maven-3.0.5
export MAVEN_OPTS="-Xms256m -Xmx512m"
export CLASSPATH=.:$CLASSPATH:$MAVEN_HOME/lib
export PATH=$PATH:$MAVEN_HOME/bin
# protobuf
export PROTOBUF_HOME=/usr/local/protobuf
export CLASSPATH=.:$CLASSPATH:$PROTOBUF_HOME/lib
export PATH=$PATH:$PROTOBUF_HOME/bin
# ant
export ANT_HOME=/usr/local/apache-ant-1.9.4
export CLASSPATH=.:$CLASSPATH:$ANT_HOME/lib
export PATH=$PATH:$ANT_HOME/bin
# findbugs
export FINDBUGS_HOME=/usr/local/findbugs-2.0.2
export CLASSPATH=.:$CLASSPATH:$FINDBUGS_HOME/lib
export PATH=$PATH:$FINDBUGS_HOME/bin

# Make the new configuration take effect immediately:
source /etc/profile

10. Fix a bug. The code extracted from the hadoop 2.2.0 source package needs a patch before compilation; otherwise an error occurs while compiling hadoop-auth. Fix:

[[email protected] ~]# cd hadoop-2.2.0-src
[[email protected] hadoop-2.2.0-src]# vim hadoop-common-project/hadoop-auth/pom.xml

Add the jetty-util test dependency so that the section reads:

<dependency>
  <groupId>org.mortbay.jetty</groupId>
  <artifactId>jetty-util</artifactId>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.mortbay.jetty</groupId>
  <artifactId>jetty</artifactId>
  <scope>test</scope>
</dependency>

11. Test whether each tool is installed and its environment variables took effect:

java -version
mvn -version
findbugs -version
protoc --version

12. Compile [this takes a long time, about half an hour]:

(First possible error: the software bug above, i.e. the missing jetty-util dependency. Second possible error: the environment variables have not taken effect, or a tool was not installed successfully.)

[[email protected] ~]# cd hadoop-2.2.0-src
[[email protected] hadoop-2.2.0-src]# mvn clean package -DskipTests -Pdist,native,docs -Dtar > /root/2.txt
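Before starting the half-hour Maven build, it is worth verifying that every tool installed above is actually on the PATH; a not-yet-sourced /etc/profile is exactly the kind of environment-variable problem described later in this post. A minimal sketch (the `check_tool` helper is my own, hypothetical name):

```shell
# Report whether each required build tool can be found on the current PATH.
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1: ok"
  else
    echo "$1: MISSING"
  fi
}

# The tools needed for the Hadoop 2.2.0 native build (from the steps above).
for t in java mvn ant protoc cmake findbugs; do
  check_tool "$t"
done
```

Any line reporting MISSING means the install step failed or /etc/profile was not re-sourced in this shell.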
13. The compiled 64-bit Hadoop package:

[[email protected] ~]# cd hadoop-2.2.0-src/hadoop-dist/target/
[[email protected] target]# ll
drwxr-xr-x 2 root root      4096 Aug 22 20:30 antrun
-rw-r--r-- 1 root root      1618 Aug 22 20:30 dist-layout-stitching.sh
-rw-r--r-- 1 root root       635 Aug 22 20:30 dist-tar-stitching.sh
drwxr-xr-x 9 root root      4096 Aug 22 20:30 hadoop-2.2.0
-rw-r--r-- 1 root root 129814908 Aug 22 20:31 hadoop-2.2.0.tar.gz    # copy this package to run on a 64-bit platform
-rw-r--r-- 1 root root      2747 Aug 22 20:30 hadoop-dist-2.2.0.jar
-rw-r--r-- 1 root root 264969498 Aug 22 20:31 hadoop-dist-2.2.0-javadoc.jar
drwxr-xr-x 2 root root      4096 Aug 22 20:31 javadoc-bundle-options
drwxr-xr-x 2 root root      4096 Aug 22 20:30 maven-archiver
drwxr-xr-x 2 root root      4096 Aug 22 20:30 test-dir

[[email protected] ~]# cd hadoop-2.2.0-src/hadoop-dist/
[[email protected] hadoop-dist]# ll
-rw-r--r-- 1 67974 users 6681 Oct  7  2013 pom.xml
drwxr-xr-x 7 root  root  4096 Aug 22 20:31 target

[[email protected] hadoop-dist]# ll target/hadoop-2.2.0
drwxr-xr-x 2 root root 4096 Aug 22 20:30 bin
drwxr-xr-x 3 root root 4096 Aug 22 20:30 etc
drwxr-xr-x 2 root root 4096 Aug 22 20:30 include
drwxr-xr-x 3 root root 4096 Aug 22 20:30 lib
drwxr-xr-x 2 root root 4096 Aug 22 20:30 libexec
drwxr-xr-x 2 root root 4096 Aug 22 20:30 sbin
drwxr-xr-x 4 root root 4096 Aug 22 20:30 share

[[email protected] hadoop-dist]# ll target/hadoop-2.2.0/lib
drwxr-xr-x 2 root root 4096 Aug 22 20:30 native

[[email protected] hadoop-dist]# ll target/hadoop-2.2.0/lib/native/
-rw-r--r-- 1 root root  733290 Aug 22 20:30 libhadoop.a
-rw-r--r-- 1 root root 1487236 Aug 22 20:30 libhadooppipes.a
lrwxrwxrwx 1 root root      18 Aug 22 20:30 libhadoop.so -> libhadoop.so.1.0.0
-rwxr-xr-x 1 root root  412078 Aug 22 20:30 libhadoop.so.1.0.0
-rw-r--r-- 1 root root  581944 Aug 22 20:30 libhadooputils.a
-rw-r--r-- 1 root root  273394 Aug 22 20:30 libhdfs.a
lrwxrwxrwx 1 root root      16 Aug 22 20:30 libhdfs.so -> libhdfs.so.0.0.0
-rwxr-xr-x 1 root root  181122 Aug 22 20:30 libhdfs.so.0.0.0

An error encountered during compilation:

[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 18:06.235s
[INFO] Finished at: Fri Aug 22 19:44:48 CST 2014
[INFO] Final Memory: 50M/247M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:2.5.1:testCompile (default-testCompile) on project hadoop-auth: Compilation failure:
[ERROR] /root/hadoop-2.2.0-src/hadoop-common-project/hadoop-auth/src/test/java/org/apache/hadoop/security/authentication/client/AuthenticatorTestCase.java:[88,11] error: cannot access AbstractLifeCycle
[ERROR] class file for org.mortbay.component.AbstractLifeCycle not found
[ERROR] AuthenticatorTestCase.java:[] error: cannot access LifeCycle
[ERROR] class file for org.mortbay.component.LifeCycle not found
[ERROR] AuthenticatorTestCase.java:[98,10] error: cannot find symbol
[ERROR] symbol: method start(); location: variable server of type Server
[ERROR] AuthenticatorTestCase.java:[] error: cannot find symbol
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-auth

The hadoop-auth compilation error above means the environment variables did not take effect! Reconfigure and re-check the PATH settings.
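Since the build output is redirected to /root/2.txt above, two small helpers make it easy to confirm, before copying the tarball anywhere, that the build succeeded and that the rebuilt native libraries really are 64-bit. This is a sketch under my own assumptions: `build_status` and `is_64bit` are hypothetical helper names, and the paths are the ones used earlier in this post.

```shell
# Return "ok" if the Maven log contains the success banner, else "failed".
build_status() {
  # $1 = path to the Maven build log
  if grep -q "BUILD SUCCESS" "$1" 2>/dev/null; then
    echo ok
  else
    echo failed
  fi
}

# Return "yes" if file(1) reports the given binary as 64-bit, else "no".
is_64bit() {
  file -L "$1" 2>/dev/null | grep -q "64-bit" && echo yes || echo no
}

# Check the log produced by the redirected build above.
build_status /root/2.txt

# Check the native libraries produced by the build above.
for lib in hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0/lib/native/*.so*; do
  [ -e "$lib" ] && echo "$lib: 64-bit=$(is_64bit "$lib")"
done
```

If `build_status` prints failed, grep the log for the first ERROR line; if any library reports no, the build did not pick up the 64-bit toolchain.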

[Image: final.png — http://s3.51cto.com/wyfs02/M00/47/1D/wKioL1P238bRloN0AAW3wzgQig8297.jpg]

Reference:

http://wenku.baidu.com/view/b0d2440831126edb6e1a1011.html

http://wenku.baidu.com/view/76c933e89ec3d5bbfd0a74fe.htm

This article is from Zheng Yansheng's blog; please be sure to keep this source: http://467754239.blog.51cto.com/4878013/1543494

