Writing an HDFS Java Client (operating on HDFS from Java code)

Source: Internet
Author: User
The source code is as follows:
package com.sfd.hdfs;

import java.io.FileInputStream;
import java.io.IOException;

import org.apache.commons.compress.utils.IOUtils;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.LocatedFileStatus;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.RemoteIterator;
import org.junit.BeforeClass;
import org.junit.Test;

public class HdfsUtils {

    private static FileSystem fs;

    /**
     * Initializes fs once before the tests run.
     * @throws Exception
     */
    @BeforeClass
    public static void init() throws Exception {
        Configuration conf = new Configuration();
        /*
         * Point the client at HDFS. Alternatively, copy the core-site.xml and
         * hdfs-site.xml files from the working directory into the project's
         * src directory. Without either, the program fails because its default
         * file system is the local file system rather than HDFS. The setting
         * below tells the program that this configuration applies to HDFS (the
         * distributed file system) and overrides the corresponding property in
         * the hdfs-site.xml under the src directory. The FileSystem obtained
         * from this configuration is actually a DistributedFileSystem, a
         * subclass of the abstract class FileSystem.
         */
        conf.set("fs.defaultFS", "hdfs://localhost:9000");
        fs = FileSystem.get(conf);
    }

    /**
     * Uploads a file to HDFS without the convenience API.
     * @throws Exception
     */
    @Test
    public void upload() throws Exception {
        Path path = new Path("hdfs://localhost:9000/sfd/sfd3.txt");
        // Output stream obtained from the file system
        FSDataOutputStream os = fs.create(path);
        // Input stream over the client's local file
        FileInputStream in = new FileInputStream("/home/sfd/soft/download/sfd1.txt");
        // Copy the input stream into the output stream via the utility class,
        // which uploads the local file to HDFS
        IOUtils.copy(in, os);
        in.close();
        os.close();
    }

    /**
     * Uploads a local file using the framework API.
     * @throws Exception
     */
    @Test
    public void upload2() throws Exception {
        fs.copyFromLocalFile(new Path("/home/sfd/soft/download/sfd1.txt"),
                new Path("hdfs://localhost:9000/aa/bb/sfd3.txt"));
    }
    /**
     * Downloads a file from HDFS to the local file system using the framework API.
     * @throws Exception
     */
    @Test
    public void download() throws Exception {
        fs.copyToLocalFile(new Path("hdfs://localhost:9000/aa/bb/sfd3.txt"),
                new Path("/home/sfd/soft/download/sfd4.txt"));
    }

    /**
     * Lists all files under a directory (including files in subfolders,
     * excluding the folders themselves).
     * @throws Exception
     */
    @Test
    public void listFile() throws Exception {
        // Iterator over the statuses of the files under the given HDFS
        // directory (first parameter); the second parameter says whether to
        // also iterate over files inside subfolders of that directory
        RemoteIterator<LocatedFileStatus> files = fs.listFiles(new Path("/"), false);
        // Walk the iterator and print each file name
        while (files.hasNext()) {
            // Status of each file in turn
            LocatedFileStatus file = files.next();
            // Get the path from the status, then the file name from the path
            Path path = file.getPath();
            String fileName = path.getName();
            System.out.println(fileName);
        }
    }

    /**
     * Lists the files and folders directly under the given directory.
     * @throws Exception
     */
    @Test
    public void listFileAndDir() throws Exception {
        // Get the file statuses and traverse them
        FileStatus[] listStatus = fs.listStatus(new Path("/"));
        for (FileStatus fileStatus : listStatus) {
            Path path = fileStatus.getPath();
            String name = path.getName();
            System.out.println(name);
        }
    }
    /**
     * Creates a folder in HDFS.
     * @throws Exception
     */
    @Test
    public void makeDir() throws Exception {
        // Creates missing parents: if /aa/bb does not exist yet, it is created
        // first, then cc under it
        fs.mkdirs(new Path("/aa/bb/cc"));
    }

    /**
     * Deletes a file or a non-empty folder using the framework API.
     * @throws Exception
     */
    @Test
    public void remove() throws Exception {
        // The second argument says whether to delete recursively; if the path
        // is a non-empty folder and the flag is false, the call fails
        fs.delete(new Path("/aa/bb"), true);
    }

    /**
     * Renames or moves a file using the framework API.
     * @throws Exception
     */
    @Test
    public void reName() throws Exception {
        // 1. If both paths are in the same folder, the file is renamed
        // 2. If the paths are in different folders, the file is moved
        fs.rename(new Path("/sfd/sfd1.txt"), new Path("/sfd/sfd4.txt"));
    }
}
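In upload() above, IOUtils.copy does the actual byte pumping between the local input stream and the HDFS output stream. The loop it performs is plain Java I/O, independent of Hadoop, so it can be replayed against in-memory streams and run without a cluster. The sketch below does exactly that; the class name CopyDemo, the copy helper, and the 4 KB buffer size are our own illustrative choices, not part of the original code:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class CopyDemo {

    // Minimal stand-in for IOUtils.copy: read chunks from the input stream
    // and write them to the output stream until the input is exhausted.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[4096];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // In-memory streams stand in for the local file and the HDFS file
        InputStream in = new ByteArrayInputStream("hello hdfs".getBytes());
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        long copied = copy(in, out);
        System.out.println(copied + " bytes copied: " + out);
    }
}
```

To compile and run the HdfsUtils class itself you additionally need the Hadoop client libraries (for example the org.apache.hadoop:hadoop-client artifact matching your cluster version) and JUnit 4 on the classpath.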
