Fuse-dfs mounting process

Source: Internet
Author: User
The whole fuse-dfs mount finally succeeded. After more than two weeks of on-and-off attempts, the last mount error alone cost me a week; I joined an HDFS QQ group to ask where I had gone wrong, and the answer turned out to be in step 7 below.

Environment: CentOS 6.3, Hadoop 1.2.0, JDK 1.6.0_45, fuse 2.8.4, ant 1.9.1.

1. Install fuse:

   yum install fuse-libs fuse-devel

2. Download ant from the official site and unpack it (to /usr/ant here).

3. Edit the system configuration (vi /etc/profile) and add at the end:

   export OS_ARCH=i386            # amd64 on 64-bit systems
   export OS_BIT=32               # 64 on 64-bit systems
   export JAVA_HOME=/usr/java/jdk1.6.0_45
   export ANT_HOME=/usr/ant
   export HADOOP_HOME=/usr/hadoop
   export PATH=$JAVA_HOME/bin:$ANT_HOME/bin:$PATH
   export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar:$HADOOP_HOME/lib:$HADOOP_HOME:$CLASSPATH
   export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/$OS_ARCH/server:$HADOOP_HOME/c++/Linux-$OS_ARCH-$OS_BIT/lib:/usr/local/lib:/usr/lib

   (HADOOP_HOME is set before the lines that reference it.) Save and exit, then load the new settings:

   source /etc/profile

4. Compile libhdfs:

   cd $HADOOP_HOME
   ant compile-c++-libhdfs -Dlibhdfs=1 -Dcompile.c++=1
   ln -s c++/Linux-$OS_ARCH-$OS_BIT/lib build/libhdfs
   ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1

   (Notes: 1. If compilation fails, dependency packages are probably missing:

   yum install automake autoconf m4 libtool pkgconfig fuse-devel fuse-libs

   gcc will be pulled in during this installation as well. 2. If compilation succeeds, ant prints BUILD SUCCESSFUL — a very pleasant sentence to see.)

5. Environment configuration:

   cd $HADOOP_HOME/build/contrib/fuse-dfs
   vi fuse_dfs_wrapper.sh

   Add at the beginning of the file:

   export JAVA_HOME=/usr/java/jdk1.6.0_45
   export HADOOP_HOME=/usr/hadoop
   export HADOOP_CONF_DIR=/usr/hadoop/conf
   export OS_ARCH=i386
   export OS_BIT=32

   and change the line fuse_dfs $@ to ./fuse_dfs $@.

6. Add permissions:

   chmod +x /usr/hadoop/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh
   chmod +x /usr/hadoop/build/contrib/fuse-dfs/fuse_dfs
   ln -s /usr/hadoop/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh /usr/local/bin
   ln -s /usr/hadoop/build/contrib/fuse-dfs/fuse_dfs /usr/local/bin/

7. Mount:

   mkdir /mnt/dfs
   cd $HADOOP_HOME/build/contrib/fuse-dfs
   ./fuse_dfs_wrapper.sh dfs://localhost:9000 /mnt/dfs

   (That last step is the one that fooled me for a week!! For the URI after fuse_dfs_wrapper.sh I had always used the value set in conf/core-site.xml, hdfs://localhost:9000, which kept failing with:

   fuse-dfs didn't recognize hdfs://localhost:9000,-2
   fuse-dfs didn't recognize /mnt/dfs,-2

   The scheme must be dfs://, not hdfs://.)

Finally, ls /mnt/dfs shows the files in HDFS.
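A couple of optional refinements. First, step 3 hardcodes OS_ARCH=i386 and OS_BIT=32. A small sketch (my own addition, not part of the original walkthrough) that derives both values from `uname -m`, so the same /etc/profile snippet works on 32- and 64-bit hosts:

```shell
#!/bin/sh
# Derive the OS_ARCH / OS_BIT values used throughout this walkthrough
# from the kernel architecture instead of hardcoding i386/32.
case "$(uname -m)" in
    x86_64) OS_ARCH=amd64; OS_BIT=64 ;;   # 64-bit x86 hosts
    *)      OS_ARCH=i386;  OS_BIT=32 ;;   # everything else, e.g. i686
esac
export OS_ARCH OS_BIT
echo "OS_ARCH=$OS_ARCH OS_BIT=$OS_BIT"
```

The two-way case keeps the same amd64/i386 naming the Hadoop c++/Linux-$OS_ARCH-$OS_BIT directory layout expects.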
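Second, since the wrapper script silently depends on several exported variables, a preflight check can fail fast before a mount attempt. A sketch under my own naming (check_env and the DEMO_* variables are hypothetical, for illustration):

```shell
#!/bin/sh
# Preflight check (my own addition): verify that every variable the
# wrapper script relies on is set and non-empty before trying to mount.
check_env() {
    for var in "$@"; do
        eval "val=\${$var}"
        [ -n "$val" ] || { echo "missing: $var"; return 1; }
    done
}

unset DEMO_UNSET_VAR
DEMO_SET_VAR=1
check_env DEMO_SET_VAR && echo "ok"              # prints "ok"
check_env DEMO_UNSET_VAR || echo "not mounting"  # prints "missing: DEMO_UNSET_VAR" then "not mounting"
```

Before step 7 one could run, for example, `check_env JAVA_HOME HADOOP_HOME OS_ARCH OS_BIT || exit 1`.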
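Finally, the week-long pitfall in step 7 — passing the hdfs:// URI straight from core-site.xml — can be caught mechanically. A minimal sketch (the function name to_dfs_uri is mine) that rewrites the scheme before handing the URI to fuse_dfs_wrapper.sh:

```shell
#!/bin/sh
# fuse_dfs rejects the hdfs:// scheme stored in conf/core-site.xml with
# "fuse-dfs didn't recognize hdfs://localhost:9000,-2"; it wants dfs://.
# Rewrite a leading hdfs:// prefix to dfs://, leaving dfs:// URIs untouched.
to_dfs_uri() {
    printf '%s\n' "$1" | sed 's|^hdfs://|dfs://|'
}

to_dfs_uri "hdfs://localhost:9000"   # -> dfs://localhost:9000
to_dfs_uri "dfs://localhost:9000"    # -> dfs://localhost:9000
```

Usage: `./fuse_dfs_wrapper.sh "$(to_dfs_uri hdfs://localhost:9000)" /mnt/dfs`.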
