First, prepare the JAR packages required to run
1) avro-1.7.4.jar
2) commons-cli-1.2.jar
3) commons-codec-1.4.jar
4) commons-collections-3.2.1.jar
5) commons-compress-1.4.1.jar
6) commons-configuration-1.6.jar
7) commons-io-2.4.jar
8) commons-lang-2.6.jar
9) commons-logging-1.2.jar
10) commons-math3-3.1.1.jar
11) commons-net-3.1.jar
12) curator-client-2.7.1.jar
13) curator-recipes-2.7.1.jar
14) gson-2.2.4.jar
15) guava-20.0.jar
16) hadoop-annotations-2.8.0.jar
17) hadoop-auth-2.8.0.jar
18) hadoop-common-2.8.0.jar
19) hadoop-hdfs-2.8.0.jar
20) hadoop-hdfs-client-2.8.0.jar
21) htrace-core4-4.0.1-incubating.jar
22) httpclient-4.5.2.jar
23) jackson-core-asl-1.9.13.jar
24) jackson-mapper-asl-1.9.13.jar
25) jersey-core-1.9.jar
26) jersey-json-1.9.jar
27) jersey-server-1.9.jar
28) jets3t-0.9.0.jar
29) jetty-6.1.26.jar
30) jetty-sslengine-6.1.26.jar
31) jetty-util-6.1.26.jar
32) jsch-0.1.51.jar
33) jsr305-3.0.0.jar
34) log4j-1.2.17.jar
35) protobuf-java-2.5.0.jar
36) servlet-api-2.5.jar
37) slf4j-api-1.7.21.jar
38) xmlenc-0.52.jar
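As an alternative to collecting these JARs by hand (an assumption on my part, since the src/main/resources layout used below suggests a Maven project), a single hadoop-client dependency transitively pulls in hadoop-common, hadoop-hdfs, and most of the libraries listed above:

```xml
<!-- Sketch of a pom.xml dependency; version matches the 2.8.0 JARs above -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.8.0</version>
</dependency>
```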
Second, copy the cluster configuration files to the project's src/main/resources directory
core-site.xml
hdfs-site.xml
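If you cannot copy the files from the cluster, a minimal core-site.xml for the client side looks roughly like this (a sketch; the NameNode address hdfs://master:9000 is taken from the code in the next step and should match your own cluster):

```xml
<?xml version="1.0"?>
<configuration>
    <!-- Tells the HDFS client which NameNode to talk to -->
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://master:9000</value>
    </property>
</configuration>
```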
Third, write the code
import java.io.IOException;
import java.io.InputStream;
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class HdfsDemo {

    public static void createFile(String dst, byte[] contents) throws IOException {
        String uri = "hdfs://master:9000/";
        Configuration config = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), config);

        // List all files and directories under /test/ on HDFS
        FileStatus[] statuses = fs.listStatus(new Path("/test/"));
        for (FileStatus status : statuses) {
            System.out.println("==================:" + status + ":=================");
        }

        // Create a file under /test/ on HDFS and write a line of text
        FSDataOutputStream os = fs.create(new Path("/test/hadoop4.log"));
        os.write("My first Hadoop file! Not bad!".getBytes());
        os.flush();
        os.close();

        // Display the contents of the file we just wrote
        InputStream is = fs.open(new Path("/test/hadoop4.log"));
        IOUtils.copyBytes(is, System.out, 1024, true);
    }

    public static void main(String[] args) throws IOException {
        createFile("/user/hadoop/test/", "Hello World".getBytes());
    }
}
Hadoop in Action: Developing Hadoop API Programs with Eclipse (IV)