Install and configure fuse-dfs for Hadoop under Ubuntu

1. Install fuse:

First download the fuse source package from the fuse project site and extract it. Then, in the extracted source directory, run the following three commands to install fuse:

./configure --prefix=/usr
make
sudo make install
After the installation is complete, test whether fuse was installed successfully. From the example directory of the fuse source tree:

mkdir /tmp/fuse
./hello /tmp/fuse
cat /tmp/fuse/hello

If "hello world!" is displayed, fuse is working.
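Two standard fuse commands are useful at this point (both part of normal fuse tooling, not specific to this guide): if the hello mount fails with a "device not found" style error, the fuse kernel module may not be loaded yet, and the test mount can be detached once you are done:

sudo modprobe fuse
fusermount -u /tmp/fuse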

2. Install ant:

Check whether ant is already installed on the system. If not, install it with sudo apt-get install ant.
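A quick way to check (prints the installed version if ant is on the PATH, otherwise fails):

ant -version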

3. Hadoop installation and configuration:

I am using Hadoop in pseudo-distributed mode. The relevant properties are listed below (with the split site files used by Hadoop 0.20 and later, these normally live in core-site.xml, hdfs-site.xml, and mapred-site.xml respectively):

<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost/</value>
</property>

<property>
  <name>dfs.replication</name>
  <value>1</value>
</property>

<property>
  <name>mapred.job.tracker</name>
  <value>localhost:8021</value>
</property>

Once Hadoop is configured, start it; HDFS in particular must be running for the tests that follow.
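A minimal start sequence, run from the Hadoop root directory (the namenode format is a one-time step on a fresh install; skip it if HDFS already holds data):

bin/hadoop namenode -format
bin/start-dfs.sh
bin/hadoop fs -ls /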

4. Set Environment Variables

I put these directly into ~/.profile so they are set for later use:

export JAVA_HOME=<java root directory, set according to your system>
export HADOOP_HOME=<hadoop root directory, set according to your system>
export OS_ARCH=i386
export OS_BIT=32
export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/$OS_ARCH/server:${HADOOP_HOME}/build/c++/Linux-$OS_ARCH-$OS_BIT/lib:/usr/local/lib:/usr/lib

On a 64-bit system, set OS_ARCH=amd64 and OS_BIT=64 instead. OS_ARCH and OS_BIT mainly exist to make the path settings convenient: the $HADOOP_HOME/c++/ directory contains 32-bit and 64-bit subdirectories, and the JRE has matching architecture directories, so pick the ones that match your system.
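For example, on a 32-bit Ubuntu machine with the Sun JDK and Hadoop unpacked under /usr/local (the exact paths are illustrative; substitute your own), the block might read:

export JAVA_HOME=/usr/lib/jvm/java-6-sun
export HADOOP_HOME=/usr/local/hadoop
export OS_ARCH=i386
export OS_BIT=32
export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/$OS_ARCH/server:${HADOOP_HOME}/build/c++/Linux-$OS_ARCH-$OS_BIT/lib:/usr/local/lib:/usr/lib

Remember to source ~/.profile (or log out and back in) so the variables take effect in the current shell.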

5. Create libhdfs:

cd $HADOOP_HOME
ant compile-c++-libhdfs -Dlibhdfs=1 -Dcompile.c++=1
ln -s c++/Linux-$OS_ARCH-$OS_BIT/lib build/libhdfs
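If the build succeeded, the shared library should now be reachable through the symlink; a quick listing confirms it:

ls -l build/libhdfs/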

6. fuse-dfs

cd $HADOOP_HOME
ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1

If an error is reported, check whether the required build tools are installed. In my case the error was that automake was missing, which apt-get can fix.
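For reference, installing the usual autotools stack on Ubuntu looks like this (package names may differ slightly between releases):

sudo apt-get install automake autoconf m4 libtool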

7. fuse Configuration
Place the following at the top of the $HADOOP_HOME/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh script:

export JAVA_HOME=<java root directory, set according to your system>
export HADOOP_HOME=<hadoop root directory, set according to your system>
export OS_ARCH=i386  # amd64 on a 64-bit system
export OS_BIT=32     # 64 on a 64-bit system

Then grant fuse_dfs_wrapper.sh executable permission:

chmod +x $HADOOP_HOME/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh
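As a quick sanity check (behavior may vary by Hadoop version), running the wrapper with no arguments should print the fuse_dfs usage text rather than an error about libhdfs or libjvm, which confirms the exports above are being picked up:

$HADOOP_HOME/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh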

8. Test

mkdir /tmp/dfs (this directory is the mount point that maps the HDFS namespace to /tmp/dfs; file operations under /tmp/dfs/ such as mkdir and touch are reflected directly on HDFS)

$HADOOP_HOME/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh dfs://localhost:8020 /tmp/dfs

Run ls -l /tmp/dfs and compare it against the directory listing in HDFS. If you touch a file under /tmp/dfs, it shows up in HDFS.
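A concrete round trip to verify the mount (the file name here is arbitrary; fusermount -u detaches the mount when you are done):

touch /tmp/dfs/fuse_test
$HADOOP_HOME/bin/hadoop fs -ls /
fusermount -u /tmp/dfs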
