Compiling Hadoop 2.7.3 with support for 5 types of compression

Source: Internet
Author: User
Tags: curl, gettext, openssl, ssh

Guide:
After following the steps in this post, your compiled Hadoop will support five native libraries: zlib, snappy, lz4, bzip2, and OpenSSL.

The author's environment is JDK 1.8, Hadoop 2.7.3, and Hive 2.3.0. Other versions can also be used; just pay attention to version compatibility.

1. Installing dependent tools

1. Install related tools

Because the author started from a minimal OS installation, many tools were missing. It was not entirely clear which tools were strictly required, so a large set was installed. It is recommended to install them as the root user.

# yum -y install make gcc gcc-c++ gcc-g77 flex bison file libtool libtool-libs autoconf kernel-devel libjpeg libjpeg-devel libpng libpng-devel libpng10 libpng10-devel gd gd-devel freetype freetype-devel libxml2 libxml2-devel zlib zlib-devel glib2 glib2-devel bzip2 bzip2-devel libevent libevent-devel ncurses ncurses-devel curl curl-devel e2fsprogs e2fsprogs-devel krb5 krb5-devel libidn libidn-devel openssl openssl-devel gettext gettext-devel ncurses-devel gmp-devel pspell-devel unzip libcap lsof build-essential cmake pkg-config libssl-dev lzo-devel fuse fuse-devel zlib1g-dev libprotobuf-dev protobuf-compiler snappy libbz2-dev libjansson-dev libfuse-dev

If tools are still missing after installing the above, you can install the whole development group with the following command:

# yum -y groupinstall "Development Tools"

2. Installing Protobuf

Unzip, build, and install:

# tar -zxvf protobuf-2.5.0.tar.gz
# cd /home/hadoop/protobuf-2.5.0
# ./configure --prefix=/home/hadoop/protobuf/
# make && make install

Environment variables

# vim ~/.bashrc
export PATH=/home/hadoop/protobuf/bin:$PATH
# source ~/.bashrc
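Once the PATH is updated, it is worth confirming that the right protoc is picked up, since Hadoop 2.7.x requires protobuf 2.5.0 exactly. A minimal check, as a sketch (it assumes `protoc --version` prints a line like `libprotoc 2.5.0`):

```shell
# Sketch: verify that the protoc on the PATH is the 2.5.0 build Hadoop needs.
check_protoc() {
    ver="$(protoc --version 2>/dev/null | awk '{print $2}')"
    if [ "$ver" = "2.5.0" ]; then
        echo "protoc OK ($ver)"
    else
        echo "protoc is '$ver', but Hadoop 2.7.x needs 2.5.0"
    fi
}
check_protoc
```

If the version does not match, the Maven build later fails with a "protoc version is ... expected version is 2.5.0" error.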

If the following error is reported during build and install:
libtool: install: error: cannot install 'libaprutil-1.la' to a directory
Cause: a previous ./configure run may have left the source tree less than "clean".
Workaround:
(1). Run make clean
(2). Delete the directory where protobuf was just unpacked, then re-unzip, configure, and build again

3. Install snappy

Unzip, build, and install:

# tar -zxvf snappy-1.1.1.tar.gz
# cd snappy-1.1.1
# ./configure
# make && make install

Check whether snappy was installed successfully:

# ll /usr/local/lib/ | grep snappy
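The same check can be scripted; a sketch assuming the default install prefix `/usr/local` (the `check_snappy` helper is hypothetical and takes the lib directory as an argument so it can be pointed elsewhere):

```shell
# Sketch: confirm the snappy shared library landed where the Hadoop
# native build (-Drequire.snappy) will look for it.
check_snappy() {
    libdir="${1:-/usr/local/lib}"
    if ls "$libdir"/libsnappy.so* >/dev/null 2>&1; then
        echo "snappy found in $libdir"
    else
        echo "snappy NOT found in $libdir"
        return 1
    fi
}
check_snappy || true   # checks /usr/local/lib by default
```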

2. Hadoop Compilation

1. Configure Maven

Unzip, rename, configure environment variables

# tar -zxvf apache-maven-3.3.9-bin.tar.gz
# mv apache-maven-3.3.9 maven3
# vim ~/.bashrc
export MAVEN_HOME=/home/hadoop/maven3
export M2_HOME=$MAVEN_HOME

Reload the file to make the environment variable effective immediately

# source ~/.bashrc

Create the .m2 directory and copy Maven's settings.xml into it:

# cd /home/hadoop
# mkdir .m2
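For reference, the ~/.bashrc additions accumulated so far look like this (a sketch; the protobuf line repeats the earlier step, and the Maven `bin` entry is an assumption added so that `mvn` resolves on the PATH):

```shell
# ~/.bashrc additions (paths assume the layout used above)
export MAVEN_HOME=/home/hadoop/maven3
export M2_HOME=$MAVEN_HOME
export PATH=$MAVEN_HOME/bin:/home/hadoop/protobuf/bin:$PATH
```

After `source ~/.bashrc`, running `mvn -version` should report Apache Maven 3.3.9.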
2. Compiling Hadoop

Before compiling Hadoop, add the Hadoop native library path to the environment variables:

export HADOOP_OPTS="-Djava.library.path=${HADOOP_HOME}/lib/native"

Download the Hadoop source package from the official website and unzip it:

$ tar -zxvf hadoop-2.7.3-src.tar.gz
$ cd hadoop-2.7.3-src/

Enter the following command to start compiling. The build takes a long time, so be patient:

$ mvn package -DskipTests -Pdist,native -Dtar -Drequire.snappy -e -X

If you see BUILD SUCCESS and no exception messages, Hadoop has been compiled successfully.

After compiling, you can find the Hadoop tarball here:

$ cd/home/hadoop/hadoop-2.7.3-src/hadoop-dist/target

The author runs pseudo-distributed Hadoop: unzip the freshly compiled Hadoop tarball, replace the configuration files inside, and then start it with the following commands:

$ hdfs namenode -format    # execute only on the first startup
$ start-dfs.sh
$ start-yarn.sh
$ mr-jobhistory-daemon.sh start historyserver
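To confirm the daemons actually came up, `jps` should list NameNode, DataNode, ResourceManager, NodeManager, and JobHistoryServer. A small helper, as a sketch (`missing_daemons` is a hypothetical function that just greps the `jps` output):

```shell
# Sketch: report any expected Hadoop daemon missing from a jps listing.
missing_daemons() {
    jps_out="$1"
    for d in NameNode DataNode ResourceManager NodeManager JobHistoryServer; do
        echo "$jps_out" | grep -q "$d" || echo "missing: $d"
    done
}
missing_daemons "$(jps 2>/dev/null)"   # no output means all daemons are up
```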

If you are prompted for a password repeatedly during startup, the solution is to configure passwordless SSH login with the following commands:

$ su root
# ssh-keygen -t rsa
# cd ~/.ssh
# cp id_rsa.pub authorized_keys
# exit
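The same steps spelled out with the file permissions sshd requires (a sketch; `setup_ssh_key` is a hypothetical helper that takes the target `.ssh` directory, and sshd silently ignores keys kept in group- or world-accessible files):

```shell
# Sketch: generate a key pair and authorize it for passwordless login.
setup_ssh_key() {
    sshdir="$1"
    mkdir -p "$sshdir"
    ssh-keygen -q -t rsa -N '' -f "$sshdir/id_rsa"
    cat "$sshdir/id_rsa.pub" >> "$sshdir/authorized_keys"
    chmod 700 "$sshdir"                  # sshd rejects looser permissions
    chmod 600 "$sshdir/authorized_keys"
}
# setup_ssh_key ~/.ssh && ssh localhost date   # should not prompt for a password
```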

Check Hadoop's native libraries:

$ hadoop checknative
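When everything worked, the output looks roughly like this (an illustrative sketch, not an exact transcript; paths and versions differ per system). All five libraries from the introduction should show `true`:

```
$ hadoop checknative
Native library checking:
hadoop:  true /home/hadoop/hadoop-2.7.3/lib/native/libhadoop.so.1.0.0
zlib:    true /lib64/libz.so.1
snappy:  true /usr/local/lib/libsnappy.so.1
lz4:     true revision:99
bzip2:   true /lib64/libbz2.so.1
openssl: true /lib64/libcrypto.so
```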
