Objective:
Learn to operate HDFS from Java code.
Environment:
Hadoop 2.6.4, pseudo-distributed
Windows 7 + Eclipse Luna Service Release 1 (4.4.1)
1. Create a new project
1.1 Create a new Java project, e.g. HADOOP_Pseudo-distributed
1.2 Import the jar packages from Hadoop 2.6.4
Right-click the project name -> Properties -> Java Build Path -> Libraries -> Add Library -> User Library, and create a user library named hadoop2.6.
Select the user library hadoop2.6 -> Add External JARs, then add all the jar packages under the following directories of the extracted Hadoop 2.6.4 distribution, along with the jars in each directory's lib subdirectory (a quick classpath check is sketched after the list):
share\hadoop\common
share\hadoop\hdfs
share\hadoop\yarn
share\hadoop\mapreduce
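To confirm that Eclipse actually resolves the Hadoop classes from this user library, a minimal sketch is to print where a core HDFS class was loaded from (the class name ClasspathCheck is only for illustration, it is not part of the original post):

import org.apache.hadoop.fs.FileSystem;

// Minimal sketch: verify that the hadoop2.6 user library is on the build path
// by printing the jar that the FileSystem class was loaded from.
public class ClasspathCheck {
    public static void main(String[] args) {
        // Expected output is a path ending in something like hadoop-common-2.6.4.jar
        System.out.println(FileSystem.class.getProtectionDomain().getCodeSource().getLocation());
    }
}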
2. Create a new class and write the following code
Using a stream, download a file from HDFS to the local Linux filesystem.
/**
 * Function: download hdfs://ssmaster:9000/data/paper.txt to /home/hadoop/paper.txt on Linux
 * Invocation: hadoop jar <jar package name>.jar
 */
package hadoop.hdfs;

import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;

import org.apache.commons.compress.utils.IOUtils;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class Test_fun {

    public static void main(String[] args) {

        Configuration conf = new Configuration();

        FileSystem fs = null;
        Path src = null;
        FSDataInputStream in = null;
        FileOutputStream out = null;

        src = new Path("hdfs://ssmaster:9000/data/paper.txt");

        try {
            fs = FileSystem.get(conf);
            in = fs.open(src);
        } catch (IOException e) {
            e.printStackTrace();
        }

        try {
            out = new FileOutputStream("/home/hadoop/paper.txt");
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }

        try {
            IOUtils.copy(in, out);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
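The code above works, but it never closes the streams and catches each exception separately. As an alternative (a sketch only, not from the original post; the class name Test_fun2 is illustrative), Hadoop's own org.apache.hadoop.io.IOUtils can perform the copy and close both streams in one call, using the same host name and paths as above:

package hadoop.hdfs;

import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class Test_fun2 {

    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // Open the HDFS source and the local destination.
        FSDataInputStream in = fs.open(new Path("hdfs://ssmaster:9000/data/paper.txt"));
        OutputStream out = new FileOutputStream("/home/hadoop/paper.txt");

        // Copy with a 4 KB buffer; the final "true" closes both streams when finished.
        IOUtils.copyBytes(in, out, 4096, true);
    }
}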
Note:
The "hdfs://ssmaster:9000" prefix in "hdfs://ssmaster:9000/data/paper.txt" is taken from the core-site.xml configuration file of the Hadoop installation,
parameter <name>fs.defaultFS</name><value>hdfs://ssmaster:9000</value>.
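For reference, the corresponding entry in core-site.xml normally looks like the snippet below (reconstructed from the parameter quoted above; only the fs.defaultFS property is shown):

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://ssmaster:9000</value>
  </property>
</configuration>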
3. Export the jar package and execute it
Right-click the project name -> Export -> Java/JAR file, specify the JAR path and name, specify the main class, and finish.
Upload the jar to the Linux server, run the program, and check the result:
[email protected]:~/java_program$ hadoop jar Hadoop_hdfs_download.jar
[email protected]:~$ ls
Desktop  Downloads  hadoop-2.6.4.tar.gz  java_program  paper.txt  Pictures  spark-2.0.1-bin-hadoop2.6.tgz  Videos
Documents  examples.desktop  hdfs-site.xml  Music  park-2.0.1-bin-hadoop  Public  Templates
Summary:
The steps are cumbersome and verbose.
Other options worth studying:
Run the program directly from Eclipse on Windows and operate on HDFS (a minimal sketch follows below).
Install Eclipse on Linux and run the program there.
[0007] Example of developing an HDFS program with Eclipse under Windows
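Along the lines of the first option above (running from Eclipse on Windows, which the referenced [0007] post covers), a minimal sketch is shown below. It sets fs.defaultFS programmatically so the program does not depend on the cluster-side core-site.xml; the host and path are the ones used throughout this post, the class name is illustrative, and on Windows additional setup (HADOOP_HOME/winutils, HDFS permissions) may still be required:

package hadoop.hdfs;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Sketch: operate on HDFS directly from Eclipse on Windows, without exporting a jar.
public class Test_from_windows {

    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // Point at the pseudo-distributed NameNode instead of relying on core-site.xml being on the classpath.
        conf.set("fs.defaultFS", "hdfs://ssmaster:9000");

        FileSystem fs = FileSystem.get(conf);

        // Simple connectivity check: list the /data directory that holds paper.txt.
        for (FileStatus status : fs.listStatus(new Path("/data"))) {
            System.out.println(status.getPath());
        }

        fs.close();
    }
}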