Accessing HDFS via Hadoop's C API (libhdfs)

When accessing HDFS through Hadoop's C API (libhdfs), compiling and running the example raises quite a few problems, so here is a summary:

System: Ubuntu 11.04, hadoop-0.20.203.0

The official documentation provides the following sample code:

#include "hdfs.h" 

int main (int argc, char **argv) {

    Hdfsfs fs = Hdfsconnect ("default", 0);
    Const char* Writepath = "/tmp/testfile.txt";
    Hdfsfile WriteFile = Hdfsopenfile (FS, Writepath, o_wronly| O_creat, 0, 0, 0);
    if (!writefile) {
          fprintf (stderr, "Failed to open%s for writing!\n", Writepath);
          Exit ( -1);
    }
    char* buffer = "Hello, world!";
    Tsize num_written_bytes = Hdfswrite (FS, WriteFile, (void*) buffer, strlen (buffer) +1);
    if (Hdfsflush (FS, WriteFile)) {
           fprintf (stderr, "Failed to ' flush '%s\n", writepath); 
          Exit ( -1);
    }
   Hdfsclosefile (FS, WriteFile);
}
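
To check that the write actually landed, the file can be read back through the same API. This read-back counterpart is my own minimal sketch, not part of the official sample; the buffer size is an arbitrary choice:

#include "hdfs.h"
#include <stdio.h>   /* printf, fprintf */
#include <stdlib.h>  /* exit */
#include <fcntl.h>   /* O_RDONLY (hdfs.h may already pull this in) */

int main(int argc, char **argv) {
    hdfsFS fs = hdfsConnect("default", 0);
    const char* readPath = "/tmp/testfile.txt";
    hdfsFile readFile = hdfsOpenFile(fs, readPath, O_RDONLY, 0, 0, 0);
    if (!readFile) {
        fprintf(stderr, "Failed to open %s for reading!\n", readPath);
        exit(-1);
    }
    char buffer[32];  /* plenty for "Hello, World!" plus the '\0' the writer appended */
    tSize num_read = hdfsRead(fs, readFile, (void*)buffer, sizeof(buffer));
    if (num_read < 0) {
        fprintf(stderr, "Failed to read %s!\n", readPath);
        exit(-1);
    }
    printf("Read %d bytes: %s\n", (int)num_read, buffer);  /* %s relies on the written '\0' */
    hdfsCloseFile(fs, readFile);
    hdfsDisconnect(fs);  /* the write sample never disconnects; harmless for a one-shot test */
    return 0;
}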

Compiling: the official website describes it like this:

See the Makefile for hdfs_test.c in the libhdfs source directory (${HADOOP_HOME}/src/c++/libhdfs/Makefile), or use something like:

gcc above_sample.c -I${HADOOP_HOME}/src/c++/libhdfs -L${HADOOP_HOME}/libhdfs -lhdfs -o above_sample

But I tried both methods and neither worked; it turned out that two things were missing:

LIB=-L$(HADOOP_INSTALL)/c++/Linux-i386-32/lib/
LIBJVM=/usr/lib/jvm/java-6-openjdk/jre/lib/i386/client/libjvm.so

So the complete Makefile is:

HADOOP_INSTALL=/home/fzuir/hadoop-0.20.203.0
PLATFORM=Linux-i386-32
JAVA_HOME=/usr/lib/jvm/java-6-openjdk/
CPPFLAGS=-I$(HADOOP_INSTALL)/src/c++/libhdfs
LIB=-L$(HADOOP_INSTALL)/c++/$(PLATFORM)/lib/
LIBJVM=/usr/lib/jvm/java-6-openjdk/jre/lib/i386/client/libjvm.so
LDFLAGS=-lhdfs

testHdfs: testHdfs.c
	gcc testHdfs.c $(CPPFLAGS) $(LIB) $(LDFLAGS) $(LIBJVM) -o testHdfs

clean:
	rm testHdfs

OK, the compilation passes, but the following errors appear at runtime:

1.

./testHdfs: error while loading shared libraries: xxx.so.0: cannot open shared object file: No such file or directory

Workaround: add the directory containing xxx.so.0 to /etc/ld.so.conf, then run /sbin/ldconfig -v to refresh the linker cache.
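
For example, assuming the missing libraries are libhdfs.so.0 and libjvm.so from the paths this article's Makefile already uses (adjust for your machine, and run as root):

echo "/home/fzuir/hadoop-0.20.203.0/c++/Linux-i386-32/lib" >> /etc/ld.so.conf
echo "/usr/lib/jvm/java-6-openjdk/jre/lib/i386/client" >> /etc/ld.so.conf
/sbin/ldconfig -v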

2.

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/configuration/Configuration

...

Call to org.apache.hadoop.fs.FileSystem::get(URI, Configuration) failed!
Exception in thread "main" java.lang.NullPointerException
Call to get configuration object from filesystem failed!

Workaround: modify /etc/profile and add the corresponding CLASSPATH entries:

HADOOP_HOME=/home/fzuir/hadoop-0.20.203.0
export PATH=$HADOOP_HOME/bin:$PATH
export CLASSPATH=.:$HADOOP_HOME/lib/commons-lang-2.4.jar:$HADOOP_HOME/hadoop-core-1.0.1.jar:$HADOOP_HOME/lib/commons-logging-api-1.0.4.jar:$HADOOP_HOME/lib/commons-configuration-1.6.jar:$JAVA_HOME/lib:$JRE_HOME/lib:$HADOOP_HOME/contrib/streaming/hadoop-streaming-1.0.1.jar:$CLASSPATH
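
These Java-side failures show up in C as hdfsConnect returning NULL, so they can be caught early in the program itself. The following check is my own addition (the official sample skips it):

#include "hdfs.h"
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv) {
    /* hdfsConnect returns NULL when the embedded JVM cannot be started or
       FileSystem.get() throws, e.g. because jars are missing from CLASSPATH. */
    hdfsFS fs = hdfsConnect("default", 0);
    if (!fs) {
        fprintf(stderr, "hdfsConnect failed: check CLASSPATH and the Hadoop config\n");
        exit(-1);
    }
    printf("Connected to HDFS.\n");
    hdfsDisconnect(fs);
    return 0;
}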

Finally, congratulations: the problems are solved.



