IDEA: Create an HDFS Project with the Java API

Source: Internet
Author: User
Tags: file copy

1. Create a Maven quickstart project

1. Write the Hadoop version number in the properties section and map it onto the dependency with an EL expression (`${hadoop.version}`)

2. Add a repository so the dependencies can be downloaded into the local repository
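A minimal pom.xml sketch of the two steps above. The Hadoop version and the CDH repository URL here are assumptions; substitute whatever version and repository your cluster requires:

```xml
<properties>
    <!-- Step 1: keep the Hadoop version in one place -->
    <hadoop.version>2.6.0-cdh5.7.0</hadoop.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <!-- The EL expression maps the property onto the dependency -->
        <version>${hadoop.version}</version>
    </dependency>
</dependencies>

<repositories>
    <!-- Step 2: repository from which the dependency is loaded into the local repository -->
    <repository>
        <id>cloudera</id>
        <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
    </repository>
</repositories>
```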

Once the dependencies finish loading, the project is ready.

The development code follows:

package com.kevin.hadoop;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.*;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.util.Progressable;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

import java.io.BufferedInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;
import java.net.URI;

/**
 * Created by Administrator on 2018/7/21.
 * Hadoop HDFS Java API operations
 */
public class HdfsApp {

    public static final String HDFS_PATH = "hdfs://hadoop000:8020";

    // File system handle
    FileSystem fileSystem = null;
    // Configuration object
    Configuration configuration = null;

    /**
     * Create a directory on HDFS
     */
    @Test
    public void mkdir() throws Exception {
        fileSystem.mkdirs(new Path("/hdfsapi/test"));
    }

    /**
     * Create a file on HDFS and write to it
     */
    @Test
    public void create() throws Exception {
        FSDataOutputStream output = fileSystem.create(new Path("/hdfsapi/test/a.txt"));
        output.write("Hello Hadoop".getBytes());
        output.flush();
        output.close();
    }

    /**
     * View the contents of an HDFS file
     */
    @Test
    public void cat() throws Exception {
        FSDataInputStream in = fileSystem.open(new Path("/hdfsapi/test/a.txt"));
        IOUtils.copyBytes(in, System.out, 1024);
        in.close();
    }

    // @Before runs before each test: set up the resources
    @Before
    public void setUp() throws Exception {
        System.out.println("HdfsApp.setUp");
        configuration = new Configuration();
        // Get the file system handle
        fileSystem = FileSystem.get(new URI(HDFS_PATH), configuration, "hadoop");
    }

    // @After runs after each test: close the resources here
    @After
    public void tearDown() throws Exception {
        // Free the resources
        configuration = null;
        fileSystem = null;
        System.out.println("HdfsApp.tearDown");
    }

    /**
     * Rename a file
     */
    @Test
    public void rename() throws Exception {
        Path oldPath = new Path("/hdfsapi/test/a.txt");
        Path newPath = new Path("/hdfsapi/test/b.txt");
        fileSystem.rename(oldPath, newPath);
    }

    /**
     * Upload a file
     *
     * @throws Exception
     */
    @Test
    public void copyFromLocalFile() throws Exception {
        Path localPath = new Path("//c//hello.txt");
        Path hdfsPath = new Path("/hdfsapi/test");
        fileSystem.copyFromLocalFile(localPath, hdfsPath);
    }

    /**
     * Upload a large file with progress feedback
     *
     * @throws Exception
     */
    @Test
    public void copyFromLocalBigFile() throws Exception {
        InputStream in = new BufferedInputStream(
                new FileInputStream(
                        new File("C:\\mmall-fe\\all.zip")));

        FSDataOutputStream output = fileSystem.create(new Path("/hdfsapi/test/all.zip"),
                new Progressable() {
                    public void progress() {
                        System.out.print("."); // Print a dot as progress feedback
                    }
                });

        IOUtils.copyBytes(in, output, 4096);
    }
    /**
     * Download an HDFS file
     */
    @Test
    public void copyToLocalFile() throws Exception {
        Path localPath = new Path("C:\\test\\b.txt");
        Path hdfsPath = new Path("/hdfsapi/test/hello.txt");
        fileSystem.copyToLocalFile(false, hdfsPath, localPath, true);
    }
    /**
     * List the files in a directory
     */
    @Test
    public void listFiles() throws Exception {
        FileStatus[] fileStatuses = fileSystem.listStatus(new Path("/hdfsapi/test"));

        for (FileStatus fileStatus : fileStatuses) {
            String isDir = fileStatus.isDirectory() ? "Folder" : "File";
            // Replication factor
            short replication = fileStatus.getReplication();
            // Size
            long len = fileStatus.getLen();
            // Path
            String path = fileStatus.getPath().toString();

            System.out.println(isDir + "\t" + replication + "\t" + len + "\t" + path);
        }
    }
    /**
     * Delete a directory recursively
     */
    @Test
    public void delete() throws Exception {
        fileSystem.delete(new Path("/hdfsapi/test/"), true);
    }

}


Note the replication factor you see when querying a file's status. In a pseudo-distributed setup, a file uploaded with the Hadoop shell (`hadoop fs -put`) shows the replication factor configured on the server side, which is 1. A file uploaded through the Java API, however, shows a replication factor of 3: we did not set a replication factor in the local client configuration, so Hadoop falls back to its own client-side default.
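As a sketch of how to avoid this mismatch (assuming the same HDFS address and user as in the tests above), the client-side replication factor can be set on the Configuration object before obtaining the FileSystem, so that files created through the Java API also end up with one replica:

```java
package com.kevin.hadoop;

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class ReplicationConfigSketch {
    public static void main(String[] args) throws Exception {
        Configuration configuration = new Configuration();
        // Override the client-side default (3) so files created through
        // the Java API use a replication factor of 1, matching the
        // pseudo-distributed server-side setting.
        configuration.set("dfs.replication", "1");
        FileSystem fileSystem = FileSystem.get(
                new URI("hdfs://hadoop000:8020"), configuration, "hadoop");
        // ... create or copy files as in the tests above ...
        fileSystem.close();
    }
}
```

The same effect can be had by placing a `dfs.replication` entry in an `hdfs-site.xml` on the client's classpath.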
