1. Unzip the HBase installation package
2. Copy the Hadoop installation package from the big data environment to Windows (here D:/hadoop is used as an example)
3. Open the hosts file in the C:\Windows\System32\drivers\etc directory and add the following entries
127.0.0.1 localhost
192.168.48.134 Master
192.168.48.133 slaver
NOTE: Add one entry for each server in your cluster. Here I have only two machines configured: 192.168.48.134 (Master) and 192.168.48.133 (slaver).
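The mapping above can be sanity-checked before involving HBase at all. This is a minimal sketch (the hostnames and IPs are the ones from my hosts file; `HostsCheck` and `parseHosts` are just illustrative names) that parses hosts-style lines into a map, mirroring what the OS resolver does with the file:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class HostsCheck {

    // Parse "IP hostname [hostname...]" lines in hosts-file format,
    // skipping blank lines and comments.
    static Map<String, String> parseHosts(String text) {
        Map<String, String> byName = new LinkedHashMap<>();
        for (String line : text.split("\\R")) {
            String trimmed = line.trim();
            if (trimmed.isEmpty() || trimmed.startsWith("#")) continue;
            String[] parts = trimmed.split("\\s+");
            for (int i = 1; i < parts.length; i++) {
                byName.put(parts[i], parts[0]); // last occurrence wins in this sketch
            }
        }
        return byName;
    }

    public static void main(String[] args) {
        String hosts = "127.0.0.1 localhost\n"
                     + "192.168.48.134 Master\n"
                     + "192.168.48.133 slaver\n";
        Map<String, String> byName = parseHosts(hosts);
        System.out.println(byName.get("Master"));  // 192.168.48.134
        System.out.println(byName.get("slaver"));  // 192.168.48.133
    }
}
```

If either lookup comes back empty on your real hosts file, the HBase client will later fail with unresolved-hostname errors, so it is worth checking first.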
4. Create a Java project in Eclipse and add all the JARs from the lib directory of the HBase folder extracted in step 1 to the build path
5. It is also best to create a folder under src, copy the hbase-site.xml from the cluster environment into it, and add that folder to the classpath (in fact, newer versions can omit this step; my Hadoop is 2.7.3 and HBase is 1.2.4, which are relatively recent versions)
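For reference, the hbase-site.xml copied from the cluster typically contains at least the ZooKeeper connection settings, something like the fragment below. The quorum host and port here are examples matching the hosts entries above; your values come from the cluster's own file, so copy it rather than retyping it:

```xml
<configuration>
  <!-- ZooKeeper quorum the client connects to; must match the cluster -->
  <property>
    <name>hbase.zookeeper.quorum</name>
    <value>Master</value>
  </property>
  <!-- ZooKeeper client port, 2181 by default -->
  <property>
    <name>hbase.zookeeper.property.clientPort</name>
    <value>2181</value>
  </property>
</configuration>
```

Because the quorum is given as a hostname, the hosts entries from step 3 are what allow the Windows client to reach it.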
6. Check whether the bin folder of the Hadoop directory from step 2 contains a winutils.exe file; if not, download one and place it in that directory
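The winutils.exe check in step 6 can be done with a few lines of plain Java. This is a sketch (`WinutilsCheck` and `hasWinutils` are illustrative names; D:\hadoop is the example directory from step 2):

```java
import java.io.File;

public class WinutilsCheck {

    // Return true if <hadoopHome>\bin\winutils.exe exists as a regular file.
    static boolean hasWinutils(String hadoopHome) {
        return new File(new File(hadoopHome, "bin"), "winutils.exe").isFile();
    }

    public static void main(String[] args) {
        String hadoopHome = "D:\\hadoop"; // the directory copied in step 2
        if (!hasWinutils(hadoopHome)) {
            System.out.println("winutils.exe missing - download it into " + hadoopHome + "\\bin");
        }
    }
}
```

Without winutils.exe, Hadoop client code on Windows typically fails at startup with a "Could not locate executable ...\bin\winutils.exe" error, so this check saves a confusing stack trace later.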
7. Sample code:
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.io.compress.Compression.Algorithm;

public class TestHBase {

    private static final String TABLE_NAME = "Test";
    private static final String CF_DEFAULT = "Info";

    public static void createOrOverwrite(Admin admin, HTableDescriptor table) throws IOException {
        if (admin.tableExists(table.getTableName())) {
            admin.disableTable(table.getTableName());
            admin.deleteTable(table.getTableName());
        }
        admin.createTable(table);
    }

    public static void createSchemaTables(Configuration config) throws IOException {
        try (Connection connection = ConnectionFactory.createConnection(config);
             Admin admin = connection.getAdmin()) {
            HTableDescriptor table = new HTableDescriptor(TableName.valueOf(TABLE_NAME));
            table.addFamily(new HColumnDescriptor(CF_DEFAULT).setCompressionType(Algorithm.NONE));
            System.out.print("Creating table. ");
            createOrOverwrite(admin, table);
            System.out.println("Done.");
        }
    }

    public static void main(String[] args) throws IOException {
        Configuration config = HBaseConfiguration.create();

        /*
         * By default, resource files are looked up from the root of the project's
         * compiled class files. You can also add them explicitly:
         *
         *     InputStream input = HBaseOper.class.getResourceAsStream("/core-site.xml");
         *     config.addResource(input);
         *
         * The example on the official website uses config.addResource(path),
         * where path can be an absolute path, for example:
         * D:\\Program Files\\hbase\\conf\\hbase-site.xml
         */

        // The Hadoop directory copied from the cluster to Windows in step 2.
        // Note that this must be an absolute path.
        System.setProperty("hadoop.home.dir", "D:\\users\\zml\\eclipse\\hadoop-2.7.3");

        // Add any necessary configuration files (hbase-site.xml, core-site.xml).
        // These files are copied from the cluster environment.
        config.addResource(new Path("D:\\users\\zml\\eclipse\\workspace\\hbasedemo\\hbase_conf\\hbase-site.xml"));
        config.addResource(new Path("D:\\users\\zml\\eclipse\\workspace\\hbasedemo\\hbase_conf\\core-site.xml"));

        createSchemaTables(config);
        // modifySchema(config);
    }
}
This completes connecting Eclipse on Windows to HBase in a big data environment.