Hadoop Reading Notes (i): Introduction to Hadoop: http://blog.csdn.net/caicongyang/article/details/39898629
Hadoop Reading Notes (ii): Shell operations on HDFS: http://blog.csdn.net/caicongyang/article/details/41253927
Operating HDFS with the Java URL API
OperateByUrl.java
package hdfs;

import java.io.InputStream;
import java.net.URL;

import org.apache.hadoop.fs.FsUrlStreamHandlerFactory;
import org.apache.hadoop.io.IOUtils;

public class OperateByUrl {
	private static final String PATH = "hdfs://192.168.80.100:9000/test.txt";

	public static void main(String[] args) throws Exception {
		// View the file: register the hdfs:// protocol handler, then read via URL
		URL.setURLStreamHandlerFactory(new FsUrlStreamHandlerFactory());
		URL url = new URL(PATH);
		InputStream in = url.openStream();
		IOUtils.copyBytes(in, System.out, 1024, true);
	}
}
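The call `IOUtils.copyBytes(in, System.out, 1024, true)` copies the stream in fixed-size chunks and, because the last argument is `true`, closes both streams when done. A minimal local sketch of that behavior (a simplified stand-in, not Hadoop's actual implementation) can be tried without a cluster:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class CopyBytesSketch {
	// Simplified stand-in for IOUtils.copyBytes(in, out, bufferSize, close):
	// copy in fixed-size chunks; when close is true, close both streams afterwards.
	static void copyBytes(InputStream in, OutputStream out, int bufferSize, boolean close) throws IOException {
		try {
			byte[] buffer = new byte[bufferSize];
			int read;
			while ((read = in.read(buffer)) != -1) {
				out.write(buffer, 0, read);
			}
			out.flush();
		} finally {
			if (close) {
				in.close();
				out.close();
			}
		}
	}

	public static void main(String[] args) throws IOException {
		byte[] data = "hello hdfs".getBytes("UTF-8");
		ByteArrayOutputStream sink = new ByteArrayOutputStream();
		copyBytes(new ByteArrayInputStream(data), sink, 1024, true);
		System.out.println(sink.toString("UTF-8")); // prints "hello hdfs"
	}
}
```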
Operating HDFS with the Hadoop Java API
OperateByHadoopApi.java
package hdfs;

import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class OperateByHadoopApi {
	// Hadoop HDFS path
	private static final String PATH = "hdfs://192.168.80.100:9000/";
	private static final String DIR = "/d1";
	private static final String FILE = "/d1/default.cfg";

	public static void main(String[] args) throws Exception {
		FileSystem fileSystem = FileSystem.get(new URI(PATH), new Configuration());

		// Create a folder
		fileSystem.mkdirs(new Path(DIR));

		// Upload a file
		// Method one:
		// fileSystem.copyFromLocalFile(new Path("F:/hadoopbaiduyundownload/liclog.txt"), new Path(DIR));
		// Method two:
		FSDataOutputStream out = fileSystem.create(new Path(FILE));
		FileInputStream in = new FileInputStream(new File("F:/hadoopbaiduyundownload/default.cfg"));
		IOUtils.copyBytes(in, out, 1024, true);

		// Download a file
		// Method one (produces a WARN, still to be resolved:
		// 14/11/19 21:39:49 WARN util.NativeCodeLoader: Unable to load native-hadoop
		// library for your platform... using builtin-java classes where applicable)
		File file = new File("F:/hadoopbaiduyundownload/test.txt");
		File file2 = new File("F:/hadoopbaiduyundownload/test2.txt");
		fileSystem.copyToLocalFile(new Path("hdfs://192.168.80.100:9000/test.txt"), new Path(file.getAbsolutePath()));
		// Method two:
		FSDataInputStream inputStream = fileSystem.open(new Path("hdfs://192.168.80.100:9000/test.txt"));
		FileOutputStream outputStream = new FileOutputStream(file2.getAbsolutePath());
		IOUtils.copyBytes(inputStream, outputStream, 1024, true);

		// Traverse a folder
		FileStatus[] listStatus = fileSystem.listStatus(new Path("/"));
		for (FileStatus fileStatus : listStatus) {
			System.out.println((fileStatus.isDir() ? "Folder" : "File") + " "
					+ fileStatus.getOwner() + " " + fileStatus.getReplication() + " " + fileStatus.getPath());
		}

		// Delete a file
		// @param path
		// @param boolean recursive: if true and path is a folder, delete recursively
		fileSystem.delete(new Path(DIR), true);
	}
}
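The `recursive` flag on `fileSystem.delete(path, true)` is what allows the non-empty `/d1` folder to be removed in one call. The semantics can be mirrored on the local filesystem with a self-contained sketch (an analogy for illustration, not HDFS code):

```java
import java.io.File;
import java.io.IOException;

public class RecursiveDeleteSketch {
	// Local-filesystem analogue of FileSystem.delete(path, recursive):
	// with recursive=true a directory and everything under it is removed;
	// with recursive=false deleting a non-empty directory fails.
	static boolean delete(File path, boolean recursive) {
		if (path.isDirectory()) {
			File[] children = path.listFiles();
			if (children != null && children.length > 0) {
				if (!recursive) {
					return false; // refuse to delete a non-empty directory
				}
				for (File child : children) {
					if (!delete(child, true)) {
						return false;
					}
				}
			}
		}
		return path.delete();
	}

	public static void main(String[] args) throws IOException {
		File dir = new File(System.getProperty("java.io.tmpdir"), "d1-demo");
		dir.mkdirs();
		new File(dir, "default.cfg").createNewFile();
		System.out.println(delete(dir, false)); // false: directory is not empty
		System.out.println(delete(dir, true));  // true: recursive delete succeeds
	}
}
```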
Everyone is welcome to discuss and study together!
Collected here for my own future reference!
Recording and sharing, so that you and I can grow together! You are welcome to visit my other posts; my blog address: http://blog.csdn.net/caicongyang
Hadoop Reading Notes (iii): Java API Operations on HDFS