Mounting HDFS with fuse-dfs: A Record


I first deployed the latest stable Hadoop 2.2.0 and followed a fuse-dfs compilation tutorial found online, but it ultimately failed with the error "Transport endpoint is not connected"; the cause is still unknown. I then installed and deployed Hadoop 1.2.1 and the test succeeded. The procedure is recorded below.

Use root to complete the following operations:

1. Install the dependency packages

apt-get install autoconf automake libtool make gawk g++ ant

2. Remove the existing FUSE packages and install FUSE from source

apt-get purge fuse
apt-get purge libfuse2
tar -zxf fuse-2.9.3.tar.gz
cd fuse-2.9.3
./configure --prefix=/usr/fuse
make
make install
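Optionally, as a quick sanity check that the userspace tools built correctly and that the kernel FUSE module can be loaded (this assumes a stock Debian/Ubuntu kernel that ships the fuse module):

/usr/fuse/bin/fusermount -V
modprobe fuse
lsmod | grep fuse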

3. Set Environment Variables

ln -s  /usr/fuse/bin/fusermount /usr/bin/

 vi /etc/profile
export FUSE_HOME=/usr/fuse
export OS_ARCH=amd64
export OS_BIT=64
export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/$OS_ARCH/server:${HADOOP_HOME}/build/c++/Linux-$OS_ARCH-$OS_BIT/lib:/usr/local/lib:/usr/lib:$FUSE_HOME/lib
source /etc/profile
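As a quick check that the variables resolve to real directories (this assumes JAVA_HOME and HADOOP_HOME are already exported in the same profile, since the LD_LIBRARY_PATH above depends on them):

echo $LD_LIBRARY_PATH
ls -d $FUSE_HOME/lib $JAVA_HOME/jre/lib/$OS_ARCH/server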

4. Compile libhdfs, the HDFS interface that fuse-dfs depends on

cd $HADOOP_HOME/
ant compile-c++-libhdfs -Dlibhdfs=1 -Dcompile.c++=1
ln -s c++/Linux-$OS_ARCH-$OS_BIT/lib build/libhdfs
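If the build succeeds, the native libraries should be reachable through the new symlink; a rough check (file names may vary slightly by version, but libhdfs.so should be present):

ls -l $HADOOP_HOME/build/libhdfs
ls $HADOOP_HOME/build/c++/Linux-$OS_ARCH-$OS_BIT/lib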
5. Compile fuse-DFS
ln -s /usr/fuse/include/* /usr/include/
ln -s /usr/fuse/lib/libfuse.so /usr/lib/
ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1
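On success, the fuse-dfs artifacts used in the following steps should appear under the contrib build directory:

ls $HADOOP_HOME/build/contrib/fuse-dfs/

Expect to see at least the fuse_dfs binary and fuse_dfs_wrapper.sh.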
6. Mount HDFS locally

Edit fuse_dfs_wrapper.sh (vi $HADOOP_HOME/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh) and add the following environment parameters at the top:

export JAVA_HOME=<your JAVA_HOME>
export HADOOP_HOME=<your HADOOP_HOME>
export FUSE_HOME=/usr/fuse
export PATH=$PATH:$HADOOP_HOME/contrib/fuse_dfs
for f in ls $HADOOP_HOME/lib/*.jar $HADOOP_HOME/*.jar
do
export CLASSPATH=$CLASSPATH:$f
done
export OS_ARCH=amd64
export OS_BIT=64
export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/$OS_ARCH/server:${HADOOP_HOME}/build/c++/Linux-$OS_ARCH-$OS_BIT/lib:/usr/local/lib:/usr/lib:$FUSE_HOME/lib
Then modify the last line to:

fuse_dfs $@

Make fuse_dfs_wrapper.sh executable

chmod 755  ${HADOOP_HOME}/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh

Create links

ln -s ${HADOOP_HOME}/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh /usr/local/bin
ln -s ${HADOOP_HOME}/build/contrib/fuse-dfs/fuse_dfs /usr/local/bin/
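A quick check that both commands are now resolvable from the PATH:

which fuse_dfs_wrapper.sh fuse_dfs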


Mount HDFS to the local directory

mkdir -p /mnt/dfs
fuse_dfs_wrapper.sh dfs://localhost:9005 /mnt/dfs
Note: localhost:9005 is the value of fs.default.name in Hadoop's core-site.xml configuration file, prefixed here with dfs://.
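For reference, a minimal core-site.xml matching the mount command above might look like the following; port 9005 is simply the value used in this setup, so substitute whatever your fs.default.name actually says:

<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9005</value>
  </property>
</configuration>

Once mounted, the HDFS root should be browsable with ordinary tools, for example ls /mnt/dfs.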

Unmount HDFS

umount /mnt/dfs
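The mount can also be released with the FUSE helper installed in step 2 (and linked into /usr/bin in step 3), which is equivalent for a FUSE filesystem:

fusermount -u /mnt/dfs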


Appendix:

1. When executing step 5, errors such as "undefined reference to fuse_get_context" appeared. Copying the failing link command and re-running it with the -L and -l options grouped at the end then produced an "undefined reference to symbol '…@@GLIBC_2.2.5'" error, which was solved by adding the -lm option. After that, re-run the compilation.
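For illustration only, the manually adjusted link command might look roughly like the sketch below; the object list and exact library set are hypothetical and should be copied from the actual failing command printed by ant, keeping the -L/-l options grouped at the end and appending -lm:

gcc -o fuse_dfs *.o -L/usr/fuse/lib -lfuse -L$HADOOP_HOME/build/libhdfs -lhdfs -L$JAVA_HOME/jre/lib/$OS_ARCH/server -ljvm -lm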

2. A "commons-logging#commons-logging;1.0.4: not found" error may appear. Fix it by changing commons-logging.version in the ivy/libraries.properties file to 1.1.1.
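That is, in $HADOOP_HOME/ivy/libraries.properties the line becomes:

commons-logging.version=1.1.1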

3. When mounting, you must supply the correct address, including the host name and port number; otherwise read/write errors will occur.
