1. Basic operation
(1) Install Eclipse, open a terminal in the Eclipse installation directory, and run the ./eclipse command to start Eclipse; alternatively, create an Eclipse launcher on the desktop and start it from there;
(2) Create a new project: File - New - Java Project (named, for example, example);
(3) Add the Hadoop-related jar packages to the project:
A. Right-click the project name - Build Path - Configure Build Path - Libraries - Add External JARs
B. Open the share/hadoop folder under the Hadoop installation directory:
Add all jar packages under the common directory and all jar packages under common/lib
Add all jar packages under the hdfs directory and all jar packages under hdfs/lib
Add all jar packages under the mapreduce directory and all jar packages under mapreduce/lib
Add all jar packages under the yarn directory and all jar packages under yarn/lib
(4) Create a new package, such as com.test.org, and in that package create a new Java class implementing the desired operation.
(5) Export: right-click the class you want to export - Export - Java - JAR file (fill in the save location and name, e.g. /home/linux/load.jar) - Next - Next (click Browse to add the main class).
(6) Run the exported JAR, as shown in the example after this list:
hadoop jar load.jar <path arguments...> (main class added to the JAR)
hadoop jar load.jar com.test.org.Load <path arguments...> (no main class added)
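For example, assuming the JAR was exported to /home/linux/load.jar as in step (5) and using the MkDir class from section 2 below (these names simply follow this document's own examples), the two forms would look like:
hadoop jar /home/linux/load.jar /test/example
hadoop jar /home/linux/load.jar com.test.data.MkDir /test/example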
2. Creating and deleting folders
(1) Create a folder (hadoop fs -mkdir /test/example)
package com.test.data;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class MkDir {
    public static void main(String[] args) throws IOException {
        String str1 = args[0];  // target HDFS directory, e.g. /test/example
        Configuration configuration = new Configuration();
        FileSystem fileSystem = FileSystem.get(configuration);
        // mkdirs also creates any missing parent directories (like mkdir -p)
        fileSystem.mkdirs(new Path(str1));
    }
}
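Exported as load.jar (step 1(5)), this class could then be run as in step 1(6), for example:
hadoop jar load.jar com.test.data.MkDir /test/example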
(2) Delete a folder (hadoop fs -rm -r /test/example)
package com.test.data;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class Delete {
    public static void main(String[] args) throws IOException {
        String str1 = args[0];  // HDFS directory to delete
        Configuration configuration = new Configuration();
        FileSystem fileSystem = FileSystem.get(configuration);
        // delete with recursive=true removes the directory together with its contents
        fileSystem.delete(new Path(str1), true);
    }
}
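Likewise, for example:
hadoop jar load.jar com.test.data.Delete /test/example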
3. Upload data (hadoop fs -put example.txt /example)
package com.test.data;

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class Load {
    public static void main(String[] args) throws IOException {
        String str1 = args[0];  // local source file, e.g. example.txt
        String str2 = args[1];  // destination path on HDFS, e.g. /example
        InputStream in = new BufferedInputStream(new FileInputStream(str1));
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(str2), conf);
        OutputStream out = fs.create(new Path(str2));
        // copy with a 4 KB buffer; true closes both streams when finished
        IOUtils.copyBytes(in, out, 4096, true);
    }
}
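For example:
hadoop jar load.jar com.test.data.Load example.txt /example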
4. Read data (hadoop fs -cat /word)
package com.test.data;

import java.io.IOException;
import java.io.InputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class Read {
    public static void main(String[] args) throws IOException {
        String uri = args[0];  // HDFS file to read, e.g. /word
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        InputStream in = null;
        try {
            in = fs.open(new Path(uri));
            // stream the file to standard output; false leaves System.out open
            IOUtils.copyBytes(in, System.out, 4096, false);
        } finally {
            IOUtils.closeStream(in);
        }
    }
}
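For example:
hadoop jar load.jar com.test.data.Read /word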