The Hadoop shell does not produce this warning when it runs, because I recompiled the source on a 64-bit machine, copied the resulting .so files into Hadoop's native directory, and set the environment variables correctly, so Hadoop itself is not the problem.
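Incidentally, an easy way to confirm that Hadoop itself is fine is its built-in checknative command, which reports whether each native library (hadoop, zlib, snappy, and so on) was loaded:

    $ hadoop checknative -a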
However, the warning still appears when launching the Spark shells.
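For reference, the message in question is the usual NativeCodeLoader warning (the wording may differ slightly between versions):

    WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable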
After searching, I found someone who had asked the same question; I followed the steps in the answer and the problem was solved.
The root cause is that the two files libhadoop.so and libsnappy.so are missing from the JRE directory. Specifically, spark-shell depends on Scala, Scala depends on the JDK that JAVA_HOME points to, and the two files libhadoop.so and libsnappy.so should be placed under $JAVA_HOME/jre/lib/amd64.
As for where to get the two .so files: libhadoop.so can be found under $HADOOP_HOME, e.g. in hadoop/lib/native. For libsnappy.so, download snappy-1.1.0.tar.gz, then run ./configure and make; after a successful build, libsnappy.so appears under the .libs folder.
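To make the steps concrete, here is the whole operation as shell commands (a sketch; the snappy source directory and the amd64 path are assumptions based on a typical 64-bit Linux setup, so adjust them to your machine):

    # build libsnappy.so from source
    tar -xzf snappy-1.1.0.tar.gz
    cd snappy-1.1.0
    ./configure
    make    # on success, libsnappy.so appears under ./.libs/

    # copy both native libraries into the JRE's library directory
    cp $HADOOP_HOME/lib/native/libhadoop.so $JAVA_HOME/jre/lib/amd64/
    cp .libs/libsnappy.so $JAVA_HOME/jre/lib/amd64/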
With the two files in place, starting the Spark shell again no longer triggers the warning.
I also found another solution:
Setting the LD_LIBRARY_PATH environment variable for Spark should have the same effect. I haven't tried it yet.
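A minimal sketch of that approach, assuming you put it in Spark's conf/spark-env.sh (which the Spark shells source at startup):

    # conf/spark-env.sh: let the dynamic linker find Hadoop's native libs
    export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native:$LD_LIBRARY_PATH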
For now I have only copied libhadoop.so over and have not built libsnappy.so, probably because nothing in my setup actually uses snappy; no error is reported either way.
What happens when Spark fails to load the Hadoop native library?