How to copy local files to HDFS and show progress with a Java program

Source: Internet
Author: User

Package the program into a jar and copy it to the Linux machine.

Change to the jar's directory and run:

hadoop jar Mapreducer.jar /home/clq/export/java/count.jar hdfs://ubuntu:9000/out06/count/

The first argument is the local source file; the second is the HDFS destination path.

If the copy succeeds, the characters printed by the progress callback (a dot per packet written) appear on the console.

package com.clq.hdfs;

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.util.Progressable;

// Copy a local file to HDFS, printing progress as the data is written.
public class FileCopyWithProgress {

    public static void main(String[] args) throws IOException {
        String localSrc = args[0];   // local source file
        String dst = args[1];        // HDFS destination path
        InputStream in = new BufferedInputStream(new FileInputStream(localSrc));
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(dst), conf);
        // The Progressable callback is invoked each time a packet of data
        // is flushed to the datanode pipeline.
        FSDataOutputStream out = fs.create(new Path(dst), new Progressable() {
            @Override
            public void progress() {
                System.out.print(".");
            }
        });
        // true: close both streams when the copy finishes.
        IOUtils.copyBytes(in, out, conf, true);
    }
}
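For reference, a typical build-and-run sequence might look like the following. This is a sketch, not from the original article: the jar name and argument paths follow the command shown earlier, while the `hadoop classpath` usage and the manifest entry are assumptions that depend on your Hadoop installation.

```shell
# Compile against the Hadoop jars (assumes 'hadoop' is on PATH).
javac -cp "$(hadoop classpath)" com/clq/hdfs/FileCopyWithProgress.java

# Package into a jar with a Main-Class entry so 'hadoop jar' can find the entry point.
jar cfe Mapreducer.jar com.clq.hdfs.FileCopyWithProgress com/clq/hdfs/FileCopyWithProgress.class

# Copy a local file to HDFS; dots print to the console as packets are written.
hadoop jar Mapreducer.jar /home/clq/export/java/count.jar hdfs://ubuntu:9000/out06/count/
```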


An exception may occur:

Exception in thread "main" org.apache.hadoop.ipc.RemoteException: java.io.IOException: Cannot create /out06; already exists as a directory
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1569)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1527)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:710)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:689)
    at sun.reflect.GeneratedMethodAccessor7.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:587)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1432)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1428)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)

This means the destination path already exists on HDFS; specify a different path (or remove the existing one) and rerun.
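Alternatively, the program can check for the destination before creating it. The fragment below is a sketch, not from the original article; it assumes the same `fs` and `dst` variables as the program above and uses the standard `FileSystem.exists`/`FileSystem.delete` calls.

```java
// Sketch: remove a pre-existing destination before fs.create() to avoid
// "already exists as a directory". Destructive: deletes the old path.
Path dstPath = new Path(dst);
if (fs.exists(dstPath)) {
    fs.delete(dstPath, true);  // true = delete recursively
}
```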
