Using the Hadoop FileSystem API to perform HDFS file read and write operations


Because HDFS differs from an ordinary file system, Hadoop provides a powerful FileSystem API for manipulating HDFS.

The core classes are FSDataInputStream and FSDataOutputStream.

Read operation:

We use FSDataInputStream to read a specified file in HDFS (the first experiment), and we also demonstrate the class's ability to seek to a given position in the file and start reading from there (the second experiment).

The code is as follows:

package com.charles.hadoop.fs;

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

/**
 * Description: read a file in the Hadoop file system using FSDataInputStream.
 * FSDataInputStream also supports positioning within the stream, so reading
 * can start from any position in the file.
 *
 * @author Charles.wang
 * @created May, 12:28:49 PM
 */
public class ReadFromHadoopFileSystem {

    public static void main(String[] args) throws Exception {
        // The first argument passed in is the URI of a file in the Hadoop file
        // system, prefixed with the hdfs://<ip> scheme
        String uri = args[0];

        // Read the configuration of the Hadoop file system
        Configuration conf = new Configuration();
        conf.set("hadoop.job.ugi", "hadoop-user,hadoop-user");

        // FileSystem is the core class for operating on HDFS; obtain the file
        // system that corresponds to the URI
        FileSystem fs = FileSystem.get(URI.create(uri), conf);

        FSDataInputStream in = null;
        try {
            // Experiment one: output the entire file content
            System.out.println("Experiment one: output the entire file content");
            // Have the FileSystem open an FSDataInputStream for the URI and read the file
            in = fs.open(new Path(uri));
            // Use Hadoop's IOUtils helper to copy the file contents to standard
            // output (50-byte buffer, do not close the stream)
            IOUtils.copyBytes(in, System.out, 50, false);

            System.out.println();
            // Experiment two: demonstrate the positioning ability of the
            // FSDataInputStream input stream, using seek()
            System.out.println("Experiment two: demonstrate FSDataInputStream positioning with seek()");

            // Output the file 3 times: first the whole content, then starting
            // from the 20th byte, then starting from the 40th byte
            for (int i = 1; i <= 3; i++) {
                in.seek(0 + 20 * (i - 1));
                System.out.println("Stream positioned, pass " + i + ":");
                IOUtils.copyBytes(in, System.out, 4096, false);
            }
        } finally {
            IOUtils.closeStream(in);
        }
    }
}

The command-line argument we pass in is the URI of the file in HDFS that we want to read:

hdfs://192.168.129.35:9000/user/hadoop-user/textfile.txt
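The title also mentions write operations, which use the other core class, FSDataOutputStream. The original article only shows the read case, so below is a minimal sketch (not from the original) of the write counterpart: it copies a local file into HDFS by opening an FSDataOutputStream with FileSystem.create(). The class name WriteToHadoopFileSystem and the choice of a local source file plus destination URI as arguments are assumptions made for illustration.

package com.charles.hadoop.fs;

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.InputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

/**
 * Hypothetical write counterpart: copy a local file into HDFS using an
 * FSDataOutputStream obtained from FileSystem.create().
 */
public class WriteToHadoopFileSystem {

    public static void main(String[] args) throws Exception {
        // args[0]: local source file, args[1]: destination HDFS URI (assumed argument order)
        String localSrc = args[0];
        String dst = args[1];

        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(dst), conf);

        InputStream in = null;
        FSDataOutputStream out = null;
        try {
            in = new BufferedInputStream(new FileInputStream(localSrc));
            // create() returns an FSDataOutputStream for the destination path
            out = fs.create(new Path(dst));
            // Copy the local stream to HDFS in 4096-byte buffers; streams are
            // closed explicitly in the finally block
            IOUtils.copyBytes(in, out, 4096, false);
        } finally {
            IOUtils.closeStream(out);
            IOUtils.closeStream(in);
        }
    }
}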
