The latest stable release, Hadoop 2.2.0, was deployed first, following a fuse-dfs compilation tutorial found online, but the attempt ultimately failed with the error "Transport endpoint is not connected", and the cause could not be determined. Hadoop 1.2.1 was then installed and deployed instead, and the test succeeded. The procedure is recorded below:
Use root to complete the following operations:
1. Install the dependency packages
apt-get install autoconf automake libtool make gawk g++ ant
2. Uninstall existing fuse and install fuse
apt-get purge fuse
apt-get purge libfuse2
tar -zxf fuse-2.9.3.tar.gz
cd fuse-2.9.3
./configure --prefix=/usr/fuse
make
make install
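Before hdfs can be mounted later on, the fuse kernel module must also be loaded. A quick check (assuming the module is named fuse, as on stock Ubuntu kernels):
modprobe fuse        # load the fuse kernel module (may already be loaded)
lsmod | grep fuse    # verify it is present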
3. Set Environment Variables
ln -s /usr/fuse/bin/fusermount /usr/bin/
vi /etc/profile
export FUSE_HOME=/usr/fuse
export OS_ARCH=amd64
export OS_BIT=64
export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/$OS_ARCH/server:${HADOOP_HOME}/build/c++/Linux-$OS_ARCH-$OS_BIT/lib:/usr/local/lib:/usr/lib:$FUSE_HOME/lib
source /etc/profile
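A quick sanity check that the paths referenced in LD_LIBRARY_PATH exist is worthwhile here; in particular, fuse_dfs later needs libjvm.so from the JRE (the path below assumes a standard Sun/Oracle JDK layout):
ls $JAVA_HOME/jre/lib/$OS_ARCH/server/libjvm.so    # the JVM shared library
ls $FUSE_HOME/lib                                  # should contain libfuse.so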
4. Compile libhdfs (the hdfs C++ interface)
cd $HADOOP_HOME
ant compile-c++-libhdfs -Dlibhdfs=1 -Dcompile.c++=1
ln -s c++/Linux-$OS_ARCH-$OS_BIT/lib build/libhdfs
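If the build succeeds, the native library should now be in place; a quick check, using the same paths as above:
ls $HADOOP_HOME/build/c++/Linux-$OS_ARCH-$OS_BIT/lib    # should contain libhdfs.so*
ls -l $HADOOP_HOME/build/libhdfs                        # the symlink created above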
5. Compile fuse-dfs
ln -s /usr/fuse/include/* /usr/include/
ln -s /usr/fuse/lib/libfuse.so /usr/lib/
ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1
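On success, the fuse_dfs binary and its wrapper script should appear under the contrib build directory:
ls $HADOOP_HOME/build/contrib/fuse-dfs/    # should list fuse_dfs and fuse_dfs_wrapper.sh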
6. Mount hdfs locally
Edit fuse_dfs_wrapper.sh (vi $HADOOP_HOME/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh) and add the environment parameters. Modify the beginning of the script as follows:
export JAVA_HOME=<your javahome>
export HADOOP_HOME=<your hadoophome>
export FUSE_HOME=/usr/fuse
export PATH=$PATH:$HADOOP_HOME/contrib/fuse_dfs
for f in `ls $HADOOP_HOME/lib/*.jar $HADOOP_HOME/*.jar`
do
  export CLASSPATH=$CLASSPATH:$f
done
export OS_ARCH=amd64
export OS_BIT=64
export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/$OS_ARCH/server:${HADOOP_HOME}/build/c++/Linux-$OS_ARCH-$OS_BIT/lib:/usr/local/lib:/usr/lib:$FUSE_HOME/lib
And modify the end of the script as follows:
fuse_dfs $@
Make fuse_dfs_wrapper.sh executable
chmod 755 ${HADOOP_HOME}/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh
Create links
ln -s ${HADOOP_HOME}/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh /usr/local/bin
ln -s ${HADOOP_HOME}/build/contrib/fuse-dfs/fuse_dfs /usr/local/bin/
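A quick check that both links resolve from the PATH:
which fuse_dfs_wrapper.sh fuse_dfs    # both should point into /usr/local/bin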
Mount hdfs to a local directory
mkdir -p /mnt/dfs
fuse_dfs_wrapper.sh dfs://localhost:9005 /mnt/dfs
Note: localhost:9005 is the host and port from the fs.default.name value in Hadoop's core-site.xml configuration file, prefixed here with the dfs:// scheme.
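For reference, the matching core-site.xml entry would look roughly like this (a sketch; the actual host and port depend on the cluster):
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9005</value>
</property>
Once mounted, the tree can be exercised with ordinary tools, for example:
df -h /mnt/dfs    # shows the fuse_dfs filesystem
ls /mnt/dfs       # lists the HDFS root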
Unmount hdfs
umount /mnt/dfs
Appendix:
1. If step 5 fails with a series of errors such as undefined reference to fuse_get_context, copy the failing link command and move the -L and -l parameters to the end of the command. If an undefined reference to symbol 'floor@GLIBC_2.2.5' error then appears when running the command, add the -lm parameter to solve it. Afterwards, re-run the compilation.
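The underlying reason is that the GNU linker resolves symbols left to right, so libraries named before the object files that use them get skipped. Illustratively (a hypothetical command, not the exact one ant produces):
# failing form: libraries listed before the objects that need them
#   gcc -L/usr/fuse/lib -lfuse -lhdfs ... fuse_dfs.o ...
# working form: objects first, then the -L/-l options, with -lm appended for floor()
#   gcc fuse_dfs.o ... -L/usr/fuse/lib -lfuse -lhdfs -lm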
2. If a commons-logging#commons-logging;1.0.4: not found error occurs, change commons-logging.version in the ivy/libraries.properties file to 1.1.1.
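Applied from the shell, the edit amounts to roughly this (assuming the file sits under $HADOOP_HOME):
sed -i 's/^commons-logging.version=.*/commons-logging.version=1.1.1/' $HADOOP_HOME/ivy/libraries.properties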
3. You must enter the correct address, including the host name and port number, during mounting. Otherwise, a read/write error occurs.