Brief introduction
When running Hadoop or Spark (invoking HDFS, etc.), the warning "Unable to load native-hadoop library for your platform" means that the native (local) library could not be loaded.
Solutions
1. Check whether the environment variables are set (if they are set but the warning persists, try step 2)
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"
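With the two variables exported (typically from ~/.bashrc or etc/hadoop/hadoop-env.sh), you can check whether Hadoop now finds its native libraries. A minimal sketch, assuming the hadoop command is on your PATH (the guard is only so the snippet runs anywhere):

```shell
# Confirm the variable is visible to the shell
echo "$HADOOP_OPTS"

# 'hadoop checknative -a' reports each native library (hadoop, zlib,
# snappy, lz4, bzip2, openssl) as true/false; with -a it exits non-zero
# if any check fails. Guarded so the snippet runs even without Hadoop.
if command -v hadoop >/dev/null 2>&1; then
    hadoop checknative -a
else
    echo "hadoop is not on PATH in this shell"
fi
```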
2. If the bundled libhadoop.so.1.0.0 (in lib/native) is 32-bit and your machine is 64-bit, you need to compile Hadoop yourself and replace the files in lib/native.
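To tell whether this is your situation, compare the machine architecture with the ELF class of the bundled library. The example runs file on /bin/sh only to demonstrate the output format; the Hadoop path in the comment is the default layout of a binary distribution (an assumption), so adjust it to your install:

```shell
# Machine architecture: x86_64 means a 64-bit machine
uname -m

# 'file' prints "ELF 32-bit ..." or "ELF 64-bit ..." for a binary;
# -L follows symlinks. Shown here on /bin/sh, which always exists:
file -L /bin/sh

# For Hadoop itself the check would be (default layout, adjust as needed):
# file "$HADOOP_HOME/lib/native/libhadoop.so.1.0.0"
```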
Compile and install options
mvn clean package -Pdist -Dtar -Dmaven.javadoc.skip=true -DskipTests --fail-at-end -Pnative
If the build complains about 'libprotoc 2.5.0' (Hadoop requires exactly protoc 2.5.0), install protobuf 2.5.0:
sudo apt-get install -y gcc g++ make maven cmake zlib1g zlib1g-dev libcurl4-openssl-dev
curl -# -O https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz
gunzip protobuf-2.5.0.tar.gz
tar -xvf protobuf-2.5.0.tar
cd protobuf-2.5.0
./configure --prefix=/usr
make
sudo make install
sudo ldconfig
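Since the Hadoop 2.x build insists on protoc 2.5.0 exactly, it is worth verifying the installed version before re-running Maven. A guarded sketch (the guard is only so the snippet runs on machines without protoc):

```shell
# The build expects this to print "libprotoc 2.5.0"; a newer or older
# protoc triggers the error above.
if command -v protoc >/dev/null 2>&1; then
    protoc --version
else
    echo "protoc is not on PATH"
fi
```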
If you hit a libtool error on Ubuntu 14.04, replace the libtool in the source tree with /usr/bin/libtool.
Compiled files
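After a successful native build, the 64-bit libraries typically sit under hadoop-dist/target/hadoop-&lt;version&gt;/lib/native in the source tree (an assumption based on the standard Hadoop layout). A sketch of swapping them into the installation, using scratch directories as stand-ins for the real paths:

```shell
# Stand-in directories purely for illustration; in practice HADOOP_HOME is
# your installation and SRC is the Maven output directory
# (hadoop-dist/target/hadoop-<version>/lib/native).
HADOOP_HOME=$(mktemp -d)/hadoop
SRC=$(mktemp -d)
mkdir -p "$HADOOP_HOME/lib/native"
touch "$SRC/libhadoop.so.1.0.0"          # stand-in for the freshly built library

cp -r "$HADOOP_HOME/lib/native" "$HADOOP_HOME/lib/native.bak"   # keep a backup
cp "$SRC"/* "$HADOOP_HOME/lib/native/"                          # swap in new libs
ls "$HADOOP_HOME/lib/native"
```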
Set the logging level for Hadoop
export HADOOP_ROOT_LOGGER=DEBUG,console
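With the logger raised to DEBUG, re-run any Hadoop command and look for lines from org.apache.hadoop.util.NativeCodeLoader (the class that emits the warning) in the console output; they state why the native library failed to load. Sketch:

```shell
# Raise the log level for this shell session only
export HADOOP_ROOT_LOGGER=DEBUG,console
echo "$HADOOP_ROOT_LOGGER"

# Any command now prints DEBUG-level logs to the console, e.g.
# (requires a working Hadoop installation):
# hadoop fs -ls /
```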
Results
Reference:
http://stackoverflow.com/questions/30702686/building-apache-hadoop-2-6-0-throwing-maven-error
http://stackoverflow.com/questions/19943766/hadoop-unable-to-load-native-hadoop-library-for-your-platform-warning
http://stackoverflow.com/questions/19556253/trunk-doesnt-compile-because-libprotoc-is-old-when-working-with-hadoop-under-ec