Summary of Hadoop's native library (Native Libraries) and related issues

Source: Internet
Author: User
Tags: builtin, file copy
Reposted from: http://blog.sina.com.cn/s/blog_3d9e90ad0102wqrp.html (not tested by me; on my machine getconf LONG_BIT returns 32 while file reports libhadoop.so.1.0.0 as 64-bit, the opposite of the case described below)

Introduction to Hadoop's native library (Native Libraries)

Hadoop is developed in the Java language, but some requirements and operations are not well suited to Java, so the concept of a native library (Native Libraries) was introduced. With native libraries, Hadoop can perform certain operations more efficiently.

Currently in Hadoop, native libraries are used for file compression: zlib and gzip.

For both of these compression modes, Hadoop by default loads the native library from the $HADOOP_HOME/lib/native/Linux-* directory.

If the load succeeds, the output is:

DEBUG util.NativeCodeLoader - Trying to load the custom-built native-hadoop library...
INFO util.NativeCodeLoader - Loaded the native-hadoop library

If the load fails, the output is:

INFO util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

You can set whether to use the native library in Hadoop's configuration file core-site.xml:


<property>
  <name>hadoop.native.lib</name>
  <value>true</value>
  <description>Should native hadoop libraries, if present, be used.</description>
</property>

The default configuration for Hadoop is to enable the native library.

In addition, you can set the native library location via an environment variable:

export JAVA_LIBRARY_PATH=/path/to/hadoop-native-libs

Sometimes you will find that the native library that ships with Hadoop does not work. In that case you need to compile the native library yourself. In the $HADOOP_HOME directory, run the following command:

ant compile-native

After compiling, you can find the resulting files in the $HADOOP_HOME/build/native directory; then either specify that path explicitly or move the compiled files into the default directory, as sketched below.
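A minimal sketch of that copy step (the platform subdirectory name, here Linux-amd64-64, is an assumption; the ant build names it after your OS and architecture):

# Copy the freshly built native libraries over the bundled ones
cp $HADOOP_HOME/build/native/Linux-amd64-64/lib/* $HADOOP_HOME/lib/native/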

The official check command: checknative

Official Link: http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/NativeLibraries.html#Native_Hadoop_Library

NativeLibraryChecker is a tool to check whether native libraries are loaded correctly. You can launch NativeLibraryChecker as follows:

$ hadoop checknative -a
14/12/06 01:30:45 WARN bzip2.Bzip2Factory: Failed to load/initialize native-bzip2 library system-native, will use pure-Java version
14/12/06 01:30:45 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
Native library checking:
hadoop: true /home/ozawa/hadoop/lib/native/libhadoop.so.1.0.0
zlib:   true /lib/x86_64-linux-gnu/libz.so.1
snappy: true /usr/lib/libsnappy.so.1
lz4:    true revision:99
bzip2:  false
For more articles on Hadoop, refer to: http://www.cnblogs.com/gpcuster/tag/Hadoop/

Summary of Hadoop native library issues

Recently I set out to use HBase to create a table with snappy compression and ran into some Hadoop native library problems. These problems had in fact always been there, but since they did not affect normal use they never drew attention. This time the goal was to solve the following problems completely:

Issue one: The following log appears when executing start-dfs.sh

xxxx: Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /usr/local/hadoop-2.4.0/lib/native/libhadoop.so which might have disabled stack guard. The VM will try to fix the stack guard now.

xxxx: It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.

This is because the native library provided in the official release is 32-bit and cannot be loaded in a 64-bit host environment. You need to download the Hadoop source code and compile it yourself (guides on compiling can be found online); after a successful build, copy the files under native to the ${HADOOP_HOME}/lib/native directory.
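Before rebuilding, you can confirm the word-size mismatch with standard tools (a sketch; the library path is the one from the warning above):

getconf LONG_BIT    # prints 32 or 64 for the running OS userland
file /usr/local/hadoop-2.4.0/lib/native/libhadoop.so.1.0.0    # reports "ELF 32-bit" or "ELF 64-bit"

If the architectures already match and only the stack-guard warning remains, the JVM message itself suggests the fix (this assumes the execstack tool is installed):

execstack -c /usr/local/hadoop-2.4.0/lib/native/libhadoop.so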

Issue two: The following log appears when executing start-dfs.sh

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

All the articles found on the web say to include the following two lines of configuration in hadoop-env.sh:

export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_HOME}/lib/native
export HADOOP_OPTS="-Djava.library.path=${HADOOP_HOME}/lib/"

However, in testing, the warning still appeared after adding the above configuration, indicating that the native library was not loaded successfully.

Turn on debug:

export HADOOP_ROOT_LOGGER=DEBUG,console

Execute start-dfs.sh and find the following log:

DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path

The log shows that the Hadoop library is not in any directory configured by java.library.path, so the problem must be the path set for java.library.path. Reconfigure in hadoop-env.sh:

export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_HOME}/lib/native
export HADOOP_OPTS="-Djava.library.path=${HADOOP_HOME}/lib/native/"

After executing start-dfs.sh, the warning no longer appears. Further testing showed that exporting HADOOP_OPTS alone is enough to solve the problem.

Verify that the native library was loaded successfully: hadoop checknative

15/08/18 10:31:17 INFO bzip2.Bzip2Factory: Successfully loaded & initialized native-bzip2 library system-native
15/08/18 10:31:17 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
Native library checking:
hadoop: true /usr/local/hadoop-2.4.0/lib/native/libhadoop.so.1.0.0
zlib:   true /lib64/libz.so.1
snappy: true /usr/local/hadoop-2.4.0/lib/native/Linux-amd64-64/libsnappy.so.1
lz4:    true revision:99
bzip2:  true /lib64/libbz2.so.1

The output above indicates that the native library has been loaded successfully.


For installing and configuring snappy compression, you can refer to the following two articles:

http://www.micmiu.com/bigdata/hadoop/hadoop-snappy-install-config/

http://www.cnblogs.com/shitouer/archive/2013/01/14/2859475.html


After configuring HBase according to the articles above, creating a table would hang, and the regionserver reported: IOException: Compression algorithm 'snappy' previously failed test
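One way to reproduce this failure outside of table creation is HBase's bundled compression test utility; a sketch (the output file path is arbitrary):

hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test.txt snappy

If this command fails on a regionserver host, that host's native snappy setup is the problem, independent of any table state.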


Many of the articles found online are copy-pasted from one another and never solved the problem above. Fortunately, I finally found this article: http://blackwing.iteye.com/blog/1943575. Although its description puzzled me for a while, after some trial and error it finally solved my problem; thanks to the blogger. The final steps to configure HBase are:

1. Copy the Hadoop and snappy native libraries into the $HBASE_HOME/lib/native/Linux-amd64-64/ directory (the original post shows a screenshot of the resulting directory, omitted here).


2. Add the following configuration to hbase-env.sh:

export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/Linux-amd64-64/
export HBASE_LIBRARY_PATH=$HBASE_LIBRARY_PATH:$HBASE_HOME/lib/native/Linux-amd64-64/
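After restarting HBase, a quick end-to-end check is to create a snappy-compressed table; a minimal sketch from the shell (the table and column-family names are hypothetical):

echo "create 'snappy_test', {NAME => 'cf', COMPRESSION => 'SNAPPY'}" | hbase shell

If the table is created without hanging, the native snappy library is visible to the regionservers.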

I hope this helps anyone facing the same problem.
