fuse-dfs mount: the full walkthrough

The fuse-dfs mount finally works. It took a bit over two weeks on and off, and the very last step, the mount itself, cost me a whole week; only after joining an HDFS QQ group and asking did I learn what I had gotten wrong. Here is the whole process in detail.

Preparation: CentOS 6.3, Hadoop 1.2.0, JDK 1.6.0_45, fuse 2.8.4, ant 1.9.1

1. Install fuse

    yum install fuse fuse-libs fuse-devel

2. Install ant

Download it from the official site and unpack it.

3. System configuration

    vi /etc/profile

Append at the end (note that HADOOP_HOME has to be exported before CLASSPATH and LD_LIBRARY_PATH reference it):

    export OS_ARCH=i386    # amd64 on a 64-bit machine
    export OS_BIT=32       # 64 on a 64-bit machine
    export JAVA_HOME=/usr/java/jdk1.6.0_45
    export ANT_HOME=/usr/ant
    export HADOOP_HOME=/usr/hadoop
    export PATH=$JAVA_HOME/bin:$ANT_HOME/bin:$PATH
    export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar:$HADOOP_HOME/lib:$HADOOP_HOME:$CLASSPATH
    export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/$OS_ARCH/server:$HADOOP_HOME/c++/Linux-$OS_ARCH-$OS_BIT/lib:/usr/local/lib:/usr/lib

Save, exit, and reload the file:

    source /etc/profile

4. Build libhdfs

    cd $HADOOP_HOME
    ant compile-c++-libhdfs -Dlibhdfs=1 -Dcompile.c++=1
    ln -s c++/Linux-$OS_ARCH-$OS_BIT/lib build/libhdfs
    ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1

Tips:
1. If the build fails because of missing dependencies, install them with:
    yum install automake autoconf m4 libtool pkgconfig fuse fuse-devel fuse-libs
2. gcc also needs to be installed along the way. A successful build ends with "BUILD SUCCESSFUL", which is a very welcome sight.

5. Environment configuration

    cd $HADOOP_HOME/build/contrib/fuse-dfs
    vi fuse_dfs_wrapper.sh

Add at the very top of the file:

    export JAVA_HOME=/usr/java/jdk1.6.0_45
    export HADOOP_HOME=/usr/hadoop
    export HADOOP_CONF_DIR=/usr/hadoop/conf
    export OS_ARCH=i386
    export OS_BIT=32

Then change the last line from "./fuse_dfs $@" to "fuse_dfs $@", so the script no longer depends on being run from its own directory and instead picks up the fuse_dfs that is symlinked onto the PATH in the next step.

6. Permissions

    chmod +x /usr/hadoop/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh
    chmod +x /usr/hadoop/build/contrib/fuse-dfs/fuse_dfs
    ln -s /usr/hadoop/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh /usr/local/bin
    ln -s /usr/hadoop/build/contrib/fuse-dfs/fuse_dfs /usr/local/bin/

7. Mount

    mkdir /mnt/dfs
    cd $HADOOP_HOME/build/contrib/fuse-dfs
    fuse_dfs_wrapper.sh dfs://localhost:9000 /mnt/dfs

This last step is the one that stumped me for a week. For the URI after fuse_dfs_wrapper.sh I kept using the value set in conf/core-site.xml, hdfs://localhost:9000, and kept getting:

    fuse-dfs didn't recognize hdfs://localhost:9000,-2
    fuse-dfs didn't recognize /mnt/dfs,-2

fuse_dfs expects the dfs:// scheme, not hdfs://.

Finally, ls /mnt/dfs lists the files stored in HDFS. A few optional sanity checks and variations are sketched below.
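Before attempting the mount, it helps to confirm that the pieces from steps 1 and 4 are actually in place. This is a minimal sanity-check sketch that assumes the paths used in this walkthrough (HADOOP_HOME=/usr/hadoop, 32-bit i386 build); the exact library location can vary between builds, so adjust the paths if ls finds nothing:

    # the fuse kernel module must be loaded for any FUSE mount to work
    lsmod | grep fuse || modprobe fuse

    # the build/libhdfs symlink created in step 4 should resolve to libhdfs.so
    ls -l /usr/hadoop/build/libhdfs/libhdfs.so*

    # the contrib build should have produced the fuse_dfs binary and its wrapper
    ls -l /usr/hadoop/build/contrib/fuse-dfs/fuse_dfs*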
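Since the only difference between the fs.default.name value in conf/core-site.xml and the URI fuse_dfs expects is the scheme (hdfs:// versus dfs://), the mount URI can also be derived from the config file instead of typed by hand. This is a rough sketch with grep and sed; it assumes the <value> line sits directly below the fs.default.name line, as in a typical single-property core-site.xml, and will need adjusting for other layouts:

    # pull fs.default.name out of core-site.xml and swap the scheme to dfs://
    FS_URI=$(grep -A1 'fs.default.name' /usr/hadoop/conf/core-site.xml \
             | grep '<value>' | sed -e 's/.*<value>//' -e 's/<\/value>.*//')
    MOUNT_URI=$(echo "$FS_URI" | sed 's|^hdfs://|dfs://|')
    fuse_dfs_wrapper.sh "$MOUNT_URI" /mnt/dfs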
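Once the mount succeeds, a quick end-to-end check confirms that reads and writes really go through HDFS, and fusermount -u (or plain umount) detaches the mount again. A minimal sketch, assuming the /mnt/dfs mount point from step 7 and that /tmp exists in HDFS and is writable by the current user; /tmp/fuse-test.txt is just a hypothetical test file:

    # the mount point should show up as a mounted filesystem
    mount | grep /mnt/dfs
    df -h /mnt/dfs

    # write through the FUSE mount, then read the same file back with the Hadoop CLI
    echo "hello from fuse-dfs" > /mnt/dfs/tmp/fuse-test.txt
    hadoop fs -cat /tmp/fuse-test.txt

    # detach the mount when done (umount /mnt/dfs works as well)
    fusermount -u /mnt/dfs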
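For mounting at boot, FUSE filesystems can usually be declared in /etc/fstab with the binary#device syntax. I have not verified this against this particular fuse-dfs build, and the wrapper's environment exports do not apply to a mount started from fstab, so the variables it sets (JAVA_HOME, HADOOP_HOME, LD_LIBRARY_PATH and so on) must already be visible to the mounting process; treat the line below as an assumption to test rather than a known-good recipe:

    # /etc/fstab (assumed syntax; fuse_dfs must be reachable by root, e.g. via the /usr/local/bin symlink)
    fuse_dfs#dfs://localhost:9000 /mnt/dfs fuse allow_other,rw 0 0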