1. List the files in HDFS
package com.hdfs.test;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.FileUtil;
import org.apache.hadoop.fs.Path;

public class AccessHdfs {

    public static void main(String[] args) throws IOException, ClassNotFoundException, InterruptedException {

        // The following two lines are only needed to work around specific errors
        // when accessing HDFS on the server from Windows
        System.setProperty("hadoop.home.dir", "C:/users/learn/desktop/hadoopfiles");
        System.setProperty("HADOOP_USER_NAME", "hadoop");

        Configuration conf = new Configuration();

        // conf.set("fs.defaultFS", "hdfs://192.168.1.215:9000");
        // FileSystem fs = FileSystem.get(conf);
        // If fs.defaultFS is not set in conf, you can write it like this:
        FileSystem fs = FileSystem.get(URI.create("hdfs://192.168.1.215:9000"), conf);

        // List the directory
        String dir = "/";
        FileStatus[] fileStatus = fs.listStatus(new Path(dir));
        Path[] list = FileUtil.stat2Paths(fileStatus);
        for (Path path : list) {
            System.out.println(path.toString());
        }
        fs.close();
    }
}
Note:
1> If you get java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries, it can be resolved by setting the HADOOP_HOME environment variable. In this example, the line System.setProperty("hadoop.home.dir", "C:/users/learn/desktop/hadoopfiles") serves the same purpose. In addition to this code, you need to create a bin folder under C:/users/learn/desktop/hadoopfiles, download winutils.exe from the network, and place it in that bin folder.
2> System.setProperty("HADOOP_USER_NAME", "hadoop") sets the user name. Because the owner of the Hadoop installation on Linux is the hadoop user, the user can be specified explicitly when accessing HDFS from Windows. If it is not specified, the current Windows user is presented as the user accessing the Hadoop system, which results in an error similar to Permission denied. (An alternative that passes the user name directly is sketched after these notes.)
3> When the program is packaged as a jar file, the above two lines are not required.
4> FileSystem is used to obtain an instance of the file system; FileStatus holds the metadata of a file or directory (see the sketch below for the fields it exposes).
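As mentioned in note 2>, the user name can also be passed directly when the FileSystem instance is obtained, instead of setting the system property. A minimal sketch, assuming the same address and the hadoop user; this overload of FileSystem.get also throws InterruptedException:

        // Pass the user name explicitly instead of setting HADOOP_USER_NAME
        FileSystem fs = FileSystem.get(URI.create("hdfs://192.168.1.215:9000"), conf, "hadoop");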
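To illustrate note 4>, the FileStatus objects returned by listStatus() can be inspected directly instead of being converted to paths. A minimal sketch, reusing the fs instance from the example above:

        // Print a few metadata fields for every entry under "/"
        for (FileStatus status : fs.listStatus(new Path("/"))) {
            System.out.println(status.getPath()        // full path of the entry
                    + "\t" + status.getLen()           // size in bytes
                    + "\t" + status.isDirectory()      // true for directories
                    + "\t" + status.getOwner());       // owning user
        }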
2. Create directory and delete directory
Configuration conf = new Configuration();
FileSystem fs = FileSystem.get(URI.create("hdfs://192.168.1.215:9000"), conf);

// Create directories
fs.mkdirs(new Path("TestData"));
fs.mkdirs(new Path("/dataworld"));

// Delete a directory; if it is an empty directory, the second (recursive) parameter does not matter
fs.delete(new Path("TestData"), true);

fs.close();
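A directory can also be checked before it is created or deleted, which avoids errors on repeated runs. A minimal sketch using exists(), reusing the fs instance from the code above:

// Remove TestData if it is already there, otherwise create it
Path p = new Path("TestData");
if (fs.exists(p)) {
    fs.delete(p, true);   // true = delete recursively
} else {
    fs.mkdirs(p);
}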
Note the paths used here: if a path does not start from the root directory, it is resolved relative to /user/<user name>/<directory name>.
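If you want to see how such relative paths are resolved, the home and working directories of the FileSystem instance can be printed. A minimal sketch, reusing the fs instance above; the output shown in the comment assumes the hadoop user:

// Typically prints hdfs://192.168.1.215:9000/user/hadoop for both calls
System.out.println(fs.getHomeDirectory());
System.out.println(fs.getWorkingDirectory());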
3. Uploading files and reading files
Configuration conf = new Configuration();
FileSystem fs = FileSystem.get(URI.create("hdfs://192.168.1.215:9000"), conf);

// Upload a file
Path src = new Path("C:/users/learn/desktop/hadoopfiles/test.txt");
Path dst = new Path("TestData");
fs.copyFromLocalFile(src, dst);

// Read the file
String fileDir = "TestData/test.txt";
FSDataInputStream file = fs.open(new Path(fileDir));
BufferedReader in = null;
String line;
in = new BufferedReader(new InputStreamReader(file, "UTF-8"));
while ((line = in.readLine()) != null) {
    System.out.println(line);
}
if (in != null) {
    in.close();
}
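The opposite direction works the same way, and a file can also be written to HDFS directly from the program. A minimal sketch under the same assumptions (address and TestData directory); the names hello.txt and download.txt are made up for illustration, and it needs an additional import of org.apache.hadoop.fs.FSDataOutputStream:

// Write a new file straight into HDFS
FSDataOutputStream out = fs.create(new Path("TestData/hello.txt"));
out.write("hello hdfs".getBytes("UTF-8"));
out.close();

// Download the uploaded file back to the local file system
fs.copyToLocalFile(new Path("TestData/test.txt"),
        new Path("C:/users/learn/desktop/hadoopfiles/download.txt"));

fs.close();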
The program can also be packaged as a jar file and run directly.