Hadoop HDFS Programming API Starter Series: Uploading Files from Local to HDFS (1)

Not much to say; let's go straight to the code.

Code

package zhouls.bigdata.myWholeHadoop.HDFS.hdfs5;

import java.io.IOException;

import java.net.URI;
import java.net.URISyntaxException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

/**
 *
 * @author
 * @function Copy from the local file system to HDFS
 *
 */
public class CopyingLocalFileToHDFS
{
    /**
     * @function main() method
     * @param args
     * @throws IOException
     * @throws URISyntaxException
     */
    public static void main(String[] args) throws IOException, URISyntaxException
    {
        // Local file path
        String source = "D://data/weibo.txt";
        // Alternative relative path:
        // String source = "./data/weibo.txt";
        // HDFS destination path
        String dest = "hdfs://hadoopmaster:9000/middle/weibo/";
        copyFromLocal(source, dest);
    }

    /**
     * @function Upload a local file to HDFS
     * @param source Source file path
     * @param dest Destination file path
     * @throws IOException
     * @throws URISyntaxException
     */
    public static void copyFromLocal(String source, String dest)
            throws IOException, URISyntaxException {
        // Read the Hadoop file system configuration
        Configuration conf = new Configuration();
        URI uri = new URI("hdfs://hadoopmaster:9000");
        // FileSystem is the core class for operating on HDFS; it obtains the file system for the given URI
        FileSystem fileSystem = FileSystem.get(uri, conf);
        // Source file path
        Path srcPath = new Path(source);
        // Destination path
        Path dstPath = new Path(dest);
        // Check whether the destination path exists
        if (!(fileSystem.exists(dstPath))) {
            // If the path does not exist, create it
            fileSystem.mkdirs(dstPath);
        }
        // Get the local file name
        String filename = source.substring(source.lastIndexOf('/') + 1, source.length());
        try {
            // Upload the local file to HDFS
            fileSystem.copyFromLocalFile(srcPath, dstPath);
            System.out.println("File " + filename + " copied to " + dest);
        } catch (Exception e) {
            System.err.println("Exception caught! : " + e);
            System.exit(1);
        } finally {
            fileSystem.close();
        }
    }

}
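After running the program, you can confirm from code that the file actually landed in HDFS. Below is a minimal sketch that lists the contents of the destination directory with FileSystem.listStatus(); the class name ListHdfsDirectory is made up for this example, and it assumes the same NameNode address (hdfs://hadoopmaster:9000) and destination path used above.

package zhouls.bigdata.myWholeHadoop.HDFS.hdfs5;

import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ListHdfsDirectory
{
    public static void main(String[] args) throws IOException, URISyntaxException
    {
        // Assumes the same NameNode address as in the upload example above
        Configuration conf = new Configuration();
        FileSystem fileSystem = FileSystem.get(new URI("hdfs://hadoopmaster:9000"), conf);
        // List every entry under the destination directory used in the upload
        FileStatus[] statuses = fileSystem.listStatus(new Path("/middle/weibo/"));
        for (FileStatus status : statuses) {
            System.out.println(status.getPath() + "\t" + status.getLen() + " bytes");
        }
        fileSystem.close();
    }
}

You can also verify from the command line with hdfs dfs -ls /middle/weibo/.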
