Hadoop 2.2.0: solution to Native Libraries errors


Problem description

While installing Hadoop for testing and learning, I ran into the following problem. Hadoop is 2.2.0 and the operating system is Oracle Linux 6.3, 64-bit.

[hadoop@hadoop01 input]$ hadoop dfs -put ./ in

DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.

13/10/24 15:12:53 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
put: `in': No such file or directory

Set the last line, put: `in': No such file or directory, aside for now; it is caused by a command-syntax problem, not by the native library.
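As an aside on that last line: with a relative destination, -put resolves paths against /user/&lt;username&gt; in HDFS, and this error usually just means that directory does not exist yet. A hedged sketch of the usual fix (paths are examples; adjust to your cluster):

```shell
# Create the HDFS home directory first (example path for user "hadoop").
hdfs dfs -mkdir -p /user/hadoop
# Now the relative destination "in" resolves to /user/hadoop/in.
hdfs dfs -put ./in in
# Verify the upload.
hdfs dfs -ls /user/hadoop/in
```

Using hdfs dfs instead of hadoop dfs also silences the DEPRECATED notice shown above.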

First, solve the warning: "WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable".

Note: my Hadoop environment was compiled from source, because the operating system is 64-bit while the official Hadoop 2.2.0 release appears to ship only 32-bit native libraries. For 64-bit compilation, see:
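A quick way to confirm a 32-bit/64-bit mismatch is to inspect the bundled native library with file. The path below is taken from the java.library.path shown later in the debug output; adjust it to your own install:

```shell
# Inspect the architecture of the bundled native-hadoop library.
# The path is an example from this machine's java.library.path.
file /app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0
# A stock Apache 2.2.0 tarball typically reports an "ELF 32-bit LSB shared
# object"; a 64-bit JVM cannot load that, hence the UnsatisfiedLinkError.
```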

Solution 1. Enable debug

[hadoop@hadoop01 input]$ export HADOOP_ROOT_LOGGER=DEBUG,console
[hadoop@hadoop01 input]$ hadoop dfs -put ./ in

DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.

 

13/10/24 16:11:31 DEBUG util.Shell: setsid exited with exit code 0
13/10/24 16:11:31 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
13/10/24 16:11:31 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
13/10/24 16:11:31 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
13/10/24 16:11:32 DEBUG security.Groups: Creating new Groups object
13/10/24 16:11:32 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
13/10/24 16:11:32 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
13/10/24 16:11:32 DEBUG util.NativeCodeLoader: java.library.path=/usr/java/jdk1.7.0_45/lib:/app/hadoop/hadoop-2.2.0/lib/native:/app/hadoop/hadoop-2.2.0/lib/native
13/10/24 16:11:32 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
13/10/24 16:11:32 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Falling back to shell based
13/10/24 16:11:32 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
13/10/24 16:11:32 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000
13/10/24 16:11:32 DEBUG security.UserGroupInformation: hadoop login
13/10/24 16:11:32 DEBUG security.UserGroupInformation: hadoop login commit
13/10/24 16:11:32 DEBUG security.UserGroupInformation: using local user:UnixPrincipal: hadoop
13/10/24 16:11:32 DEBUG security.UserGroupInformation: UGI loginUser:hadoop (auth:SIMPLE)
13/10/24 16:11:33 DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
13/10/24 16:11:33 DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
13/10/24 16:11:33 DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
13/10/24 16:11:33 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path =
13/10/24 16:11:33 DEBUG impl.MetricsSystemImpl: StartupProgress, NameNode startup progress
13/10/24 16:11:33 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
13/10/24 16:11:33 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@2e41d9a2
13/10/24 16:11:34 DEBUG hdfs.BlockReaderLocal: Both short-circuit local reads and UNIX domain socket are disabled.
13/10/24 16:11:34 DEBUG ipc.Client: The ping interval is 60000 ms.
13/10/24 16:11:34 DEBUG ipc.Client: Connecting to localhost/127.0.0.1:8020
13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop: starting, having connections 1
13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop sending #0
13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop got value #0
13/10/24 16:11:34 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 82ms
13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop sending #1
13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop got value #1
13/10/24 16:11:34 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 4ms
put: `.': No such file or directory
13/10/24 16:11:34 DEBUG ipc.Client: Stopping client
13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop: closed
13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop: stopped, remaining connections 0

 

The key error in the debug output above is:

Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path

Many approaches to this problem were tried, most of them involving environment variables, but none of them helped.
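For reference, the environment-variable tweaks usually suggested for this warning look like the sketch below (variable names follow hadoop-env.sh conventions; the install path is the one from the debug output above). They can only help if the .so files under lib/native actually match your architecture, which is why they were of no use here:

```shell
# Sketch only; HADOOP_HOME is this machine's example install path.
HADOOP_HOME=/app/hadoop/hadoop-2.2.0
# Tell Hadoop where the native libraries live...
export HADOOP_COMMON_LIB_NATIVE_DIR="$HADOOP_HOME/lib/native"
# ...and put that directory on the JVM's java.library.path.
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"
```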

