Unable to load Native-hadoop library for your platform (resolved)

Source: Internet
Author: User
Tags: builtin hdfs dfs log4j

1. Increase debugging information to find the problem

2. Two ways to resolve "Unable to load native-hadoop library for your platform"

Appendix: libc/glibc/glib introduction

References:

1. http://my.oschina.net/swuly302/blog/515853 ("Route 66: Hadoop Unable to load native-hadoop library for your platform")

2. http://blog.sina.com.cn/s/blog_4eca88390102vn86.html

Increase debugging information to find the problem

When an HDFS command is executed, the following warning appears:

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

We add debug information to see where the problem is.

There are two ways to enable the debug output.

1. Export the following variable before running the command:

export HADOOP_ROOT_LOGGER=DEBUG,console

For example:

$ export HADOOP_ROOT_LOGGER=DEBUG,console
$ hdfs dfs -ls /

2. Add the following line to $HADOOP_CONF_DIR/log4j.properties (for Hadoop 2.6.0 the path is /home/wangqi/software/hadoop-2.6.0/etc/hadoop/log4j.properties):

log4j.logger.org.apache.hadoop.util.NativeCodeLoader=DEBUG

I chose the second one (because a later solution also needs to edit this file). Both approaches are sketched together below.
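For reference, here is a minimal sketch of both approaches; the log4j.properties path assumes the Hadoop 2.6.0 layout mentioned above, so adjust it if yours differs:

# Method 1: enable DEBUG console logging for the current shell,
# or prefix a single command with the variable instead
export HADOOP_ROOT_LOGGER=DEBUG,console
HADOOP_ROOT_LOGGER=DEBUG,console hdfs dfs -ls /

# Method 2: append the logger setting to log4j.properties
echo 'log4j.logger.org.apache.hadoop.util.NativeCodeLoader=DEBUG' >> /home/wangqi/software/hadoop-2.6.0/etc/hadoop/log4j.properties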

Then run a command and look at the debug output:

$ hdfs dfs -ls /
16/01/05 15:05:49 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
16/01/05 15:05:49 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
16/01/05 15:05:49 DEBUG util.NativeCodeLoader: java.library.path=/home/wangqi/software/hadoop-2.6.0/lib
16/01/05 15:05:49 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 1 items
drwxr-xr-x   - wangqi supergroup          0 2016-01-05 10:07 /user

Now the problem is visible. For this issue, the common solution found online is to add two soft links:

ln -s libhadoop.so.1.0.0 libhadoop.so
ln -s libhdfs.so.0.0.0 libhdfs.so

But even after adding these two soft links, the problem remained.
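Besides reading the DEBUG log, Hadoop 2.x can also report directly which native libraries it is able to load. A quick check (the command ships with Hadoop; on a broken setup the hadoop entry is reported as false):

$ hadoop checknative -a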

Two ways to resolve "Unable to load native-hadoop library for your platform"

Let's look at the glibc version required by libhadoop.so.1.0.0:

$ ldd libhadoop.so
./libhadoop.so: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by ./libhadoop.so)
        linux-vdso.so.1 =>  (0x00007fffb53ff000)
        libdl.so.2 => /lib64/libdl.so.2 (0x00007fbb73129000)
        libc.so.6 => /lib64/libc.so.6 (0x00007fbb72d94000)
        /lib64/ld-linux-x86-64.so.2 (0x000000399b800000)

The real problem has now surfaced: libhadoop.so requires glibc version GLIBC_2.14, which cannot be found.
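To double-check the mismatch, you can compare the GLIBC_* version symbols that libhadoop.so asks for with those the system libc actually provides. A sketch, assuming binutils (objdump) is installed:

$ objdump -T libhadoop.so | grep GLIBC_
$ strings /lib64/libc.so.6 | grep ^GLIBC_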

Let's look at the glibc version of the current system as follows:

$ ldd --version
ldd (GNU libc) 2.12
Copyright (C) Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
Written by Roland McGrath and Ulrich Drepper.

As you can see, the glibc version of the current system is 2.12.

So how do we resolve this warning? There are two approaches.

1. Configure $HADOOP_CONF_DIR/log4j.properties to ignore this warning: change the log4j.logger.org.apache.hadoop.util.NativeCodeLoader=DEBUG line configured earlier to:

log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
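If you prefer to make that change from the command line, a minimal sketch (again using the Hadoop 2.6.0 path from above; adjust it to your installation):

$ sed -i 's/NativeCodeLoader=DEBUG/NativeCodeLoader=ERROR/' /home/wangqi/software/hadoop-2.6.0/etc/hadoop/log4j.properties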

2. Upgrade the system's glibc.

Download glibc-2.14.tar.bz2, Address: http://ftp.ntu.edu.tw/gnu/glibc/

Download glibc-linuxthreads-2.5.tar.bz2, Address: http://ftp.ntu.edu.tw/gnu/glibc/
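For example, assuming wget is installed and the archives sit directly under that directory on the mirror, both files can be fetched with:

$ wget http://ftp.ntu.edu.tw/gnu/glibc/glibc-2.14.tar.bz2
$ wget http://ftp.ntu.edu.tw/gnu/glibc/glibc-linuxthreads-2.5.tar.bz2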

The installation steps are as follows:

1) Put the downloaded .bz2 packages in a folder:

$ ls
glibc-2.14.tar.bz2  glibc-linuxthreads-2.5.tar.bz2

2) Extract glibc-2.14.tar.bz2 into the current directory:

$ tar -xjvf glibc-2.14.tar.bz2
$ ls
glibc-2.14  glibc-2.14.tar.bz2  glibc-linuxthreads-2.5.tar.bz2

3) Extract glibc-linuxthreads-2.5.tar.bz2 into the glibc-2.14 directory:

$ cd glibc-2.14
$ tar -xjvf ../glibc-linuxthreads-2.5.tar.bz2

At this point, the glibc-2.14 directory contains two additional folders, linuxthreads and linuxthreads_db.

4) Go back to the parent directory and set the optimization flags.

Go back to the parent directory:

$ cd ..

Add the optimization switch, otherwise compilation fails with the error '#error "glibc cannot be compiled without optimization"':

$ export CFLAGS="-g -O2"

5) Execute the following command:

$ glibc-2.14/configure --prefix=/usr --disable-profile --enable-add-ons --with-headers=/usr/include --with-binutils=/usr/bin --disable-sanity-checks

6) Execute make

Compilation takes a long time (5-10 minutes) and may fail; if it errors out, fix the error and run it again:

$ make
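If the build machine has several cores, the compilation can be sped up by running make in parallel; this is a general GNU make option rather than something specific to this procedure, and if the parallel build misbehaves you can fall back to plain make:

$ make -j"$(nproc)"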

7) Execute make install

Installation must be performed as root and also takes quite a while:

$ sudo make install

8) Use ls -l /lib64/libc.so.6 to check whether the upgrade succeeded

$ ll /lib64/libc.so.6
lrwxrwxrwx 1 root root ... 09:24 /lib64/libc.so.6 -> libc-2.14.so
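You can also re-run the version check from earlier; after a successful install it should now report 2.14 instead of 2.12:

$ ldd --version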

9) Restart Hadoop

$ stop-dfs.sh
$ stop-yarn.sh
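Only the stop scripts are shown above; to bring the cluster back up, the corresponding standard Hadoop 2.x start scripts would follow:

$ start-dfs.sh
$ start-yarn.sh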

10) Run an HDFS command and confirm that the native library is now loaded successfully

$ export HADOOP_ROOT_LOGGER=DEBUG,console
$ hdfs dfs -ls /
16/01/05 20:02:40 DEBUG util.Shell: setsid exited with exit code 0
16/01/05 20:02:41 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
16/01/05 20:02:41 DEBUG util.NativeCodeLoader: Loaded the native-hadoop library
16/01/05 20:02:42 DEBUG ipc.ProtobufRpcEngine: Call: getListing took 2ms
Found 2 items
drwxrwx---   - wangqi supergroup          0 2016-01-05 19:40 /tmp
drwxr-xr-x   - wangqi supergroup          0 2016-01-05 19:41 /user
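Once the native library loads, the extra debug output is no longer needed. A sketch of undoing the earlier changes: unset the environment variable and remove the NativeCodeLoader=DEBUG line from log4j.properties (or switch it to ERROR, as in solution 1 above):

$ unset HADOOP_ROOT_LOGGER
$ sed -i '/NativeCodeLoader=DEBUG/d' /home/wangqi/software/hadoop-2.6.0/etc/hadoop/log4j.properties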

Appendix: libc/glibc/glib

glibc and libc are both C libraries on Linux.

libc is the ANSI C library on Linux, and glibc is the GNU C library on Linux.

GLib is a set of utilities written in C, that is, a C utility library; it has no relationship to libc/glibc.

glibc is the implementation of the C standard library on Linux, the GNU C Library. glibc was originally the GNU project's C standard library and later became the standard C library of Linux, while the original Linux standard C library, Linux libc, is no longer maintained.

glibc's .so file in the /lib directory is libc.so.6.

Two ways to view the glibc version of the current system:

# ll /lib64/libc.so.6
lrwxrwxrwx. 1 root root ... 00:27 /lib64/libc.so.6 -> libc-2.12.so
# ldd --version
ldd (GNU libc) 2.12
Copyright (C) Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
Written by Roland McGrath and Ulrich Drepper.

Both methods show that the glibc version of the current system is 2.12.
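A third way, since glibc's libc.so.6 is itself an executable shared object, is to run it directly; it prints its own version banner (path as on the 64-bit system above):

$ /lib64/libc.so.6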
