Remote Debugging HBase under Eclipse


1. Preparation. After HBase is installed, start the hbase shell. The general syntax for creating a table is:

create 'table name', 'column family 1', 'column family 2', ..., 'column family n'

create 'table name', 'column family name'

Columns (qualifiers) can be added dynamically in HBase, so only the column family has to be declared when the table is created:

create 'Test_lcc_person', 'Lcc_liezu'
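
To confirm the table exists before moving on, the standard shell commands can be used (output omitted here):

list
describe 'Test_lcc_person'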

Then insert some data. Puts that share a rowkey belong to the same logical record; there are six records in total (the age values for rows 1-5 are illegible in the source and left as '…'). The general syntax is:

put 'table name', 'rowkey (the equivalent of a relational primary key; must be unique)', 'column family:column qualifier', 'value'

put 'Test_lcc_person', '1', 'Lcc_liezu:name', 'Yakakawa 1'
put 'Test_lcc_person', '1', 'Lcc_liezu:sex',  'man'
put 'Test_lcc_person', '1', 'Lcc_liezu:age',  '…'
put 'Test_lcc_person', '2', 'Lcc_liezu:name', 'Yakakawa 2'
put 'Test_lcc_person', '2', 'Lcc_liezu:sex',  'man'
put 'Test_lcc_person', '2', 'Lcc_liezu:age',  '…'
put 'Test_lcc_person', '3', 'Lcc_liezu:name', 'Yakakawa 3'
put 'Test_lcc_person', '3', 'Lcc_liezu:sex',  'man'
put 'Test_lcc_person', '3', 'Lcc_liezu:age',  '…'
put 'Test_lcc_person', '4', 'Lcc_liezu:name', 'Yakakawa 4'
put 'Test_lcc_person', '4', 'Lcc_liezu:sex',  'man'
put 'Test_lcc_person', '4', 'Lcc_liezu:age',  '…'
put 'Test_lcc_person', '5', 'Lcc_liezu:name', 'Yakakawa 5'
put 'Test_lcc_person', '5', 'Lcc_liezu:sex',  'man'
put 'Test_lcc_person', '5', 'Lcc_liezu:age',  '…'
put 'Test_lcc_person', '6', 'Lcc_liezu:name', 'Yakakawa 6'
put 'Test_lcc_person', '6', 'Lcc_liezu:sex',  'man'
put 'Test_lcc_person', '6', 'Lcc_liezu:age',  '12'

2. The data now in HBase looked as follows in the original article's screenshot (image not reproduced here).
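Instead of the screenshot, the rows can be listed with a full table scan; scan is a standard shell command that prints one line per cell:

scan 'Test_lcc_person'
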
3. Write the business code. The point of it is simply to read the HBase data and print it to the console while debugging. Build a Maven project (a sketch of the required dependencies follows) and put the classes below into it.
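
The article never shows the POM. Here is a minimal dependency sketch, assuming Hadoop 2.7.3 (as used below) and an HBase 1.2.x client; the HBase version is an assumption and should match your cluster. hbase-server is the artifact that carries the TableMapper/TableMapReduceUtil MapReduce classes:

<dependencies>
    <!-- Hadoop client for the MapReduce and HDFS APIs -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>2.7.3</version>
    </dependency>
    <!-- HBase client API (version assumed; match your cluster) -->
    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-client</artifactId>
        <version>1.2.6</version>
    </dependency>
    <!-- HBase MapReduce integration (TableMapper, TableMapReduceUtil) -->
    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-server</artifactId>
        <version>1.2.6</version>
    </dependency>
</dependencies>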

In the original screenshot of the project layout, the key classes were underlined in red; they are reproduced in full below.

// ---------- com/kensure/mr/CalculateJob.java ----------
package com.kensure.mr;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.chain.ChainMapper;
import org.apache.hadoop.mapreduce.lib.chain.ChainReducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;

import com.kensure.mr.mapper.LoadHadoopMapper;
import com.kensure.mr.mapper.LoadHbaseMapper;
import com.kensure.mr.reducer.SaveHadoopReduce;
import com.kensure.mr.reducer.SaveHbaseReduce;

public class CalculateJob {

    // private static final String INPUT_PATH = "/data/mr/out/test";
    // private static final String INPUT_PATH = "/data/mr/out/test_j";

    public void run(String[] args) throws IOException, InterruptedException, ClassNotFoundException {
        Configuration configuration = HBaseConfiguration.create();
        configuration.set("hbase.zookeeper.property.clientPort", "2181");
        configuration.set("hbase.zookeeper.quorum", "192.168.10.82");

        // Job 1: scan the HBase table and write intermediate results to HDFS.
        Job job_1 = Job.getInstance(configuration, "MyTestJob1");
        job_1.setJarByClass(CalculateJob.class);

        Scan scan = new Scan();
        scan.setCaching(1024);
        scan.setCacheBlocks(false);

        // TEST_LGGJ_ZWF // test_lggj_new_test
        TableMapReduceUtil.initTableMapperJob(args[1], scan, LoadHbaseMapper.class,
                Text.class, Text.class, job_1);
        System.out.println("================1=======input table=" + args[1]);

        // The original code also registers the mapper and reducer through the chain API.
        ChainMapper.addMapper(job_1, LoadHbaseMapper.class, ImmutableBytesWritable.class,
                Result.class, Text.class, Text.class, configuration);
        ChainReducer.setReducer(job_1, SaveHadoopReduce.class, Text.class, Text.class,
                Text.class, Text.class, configuration);

        job_1.setOutputFormatClass(TextOutputFormat.class);
        job_1.setMapOutputKeyClass(Text.class);
        job_1.setMapOutputValueClass(Text.class);
        job_1.setOutputKeyClass(Text.class);
        job_1.setOutputValueClass(Text.class);
        FileOutputFormat.setOutputPath(job_1, new Path(args[0]));
        System.out.println("================2=======job 1 output path=" + args[0]);

        // Job 2: read the intermediate results back and write them into HBase.
        Job job_2 = Job.getInstance(configuration, "MyTestJob2");
        job_2.setJarByClass(CalculateJob.class);
        job_2.setMapperClass(LoadHadoopMapper.class);
        job_2.setInputFormatClass(TextInputFormat.class);
        FileInputFormat.addInputPath(job_2, new Path(args[0]));
        job_2.setMapOutputKeyClass(Text.class);
        job_2.setMapOutputValueClass(Text.class);
        // TEST_MR_ZWF // test_lggj_new_test_res1
        TableMapReduceUtil.initTableReducerJob(args[2], SaveHbaseReduce.class, job_2);

        if (job_1.waitForCompletion(true)) {
            job_2.waitForCompletion(true);
            System.out.println("================3=======job 2 finished");
        }
    }

    public static void main(String[] args) throws IOException, InterruptedException, ClassNotFoundException {
        System.setProperty("hadoop.home.dir", "E:\\02-hadoop\\hadoop-2.7.3\\");
        System.setProperty("HADOOP_USER_NAME", "root");

        String[] array = new String[3];
        array[0] = "hdfs://bigdata01.hzjs.co/lcc/y"; // HDFS path for the intermediate output
        array[1] = "Test_lcc_person";                // source table
        array[2] = "Test_lcc_person_savea";          // target table

        System.out.println("================4=======starting");
        CalculateJob myChainMapper = new CalculateJob();
        myChainMapper.run(array);
        System.out.println("================5=======finished");
    }
}

// ---------- com/kensure/mr/common/Common.java ----------
package com.kensure.mr.common;

import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class Common {

    // Note: despite the name, this returns true when the string is non-empty.
    public static boolean isNull(String str) {
        if (str == null || str.length() <= 0) {
            return false;
        }
        return true;
    }

    public static boolean isDate(String str) {
        boolean flag = false;
        SimpleDateFormat dateFormat = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        try {
            Date date = dateFormat.parse(str);
            flag = true;
        } catch (ParseException e) {
            e.printStackTrace();
        }
        return flag;
    }
}

// ---------- com/kensure/mr/mapper/LoadHadoopMapper.java ----------
package com.kensure.mr.mapper;

import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class LoadHadoopMapper extends Mapper<LongWritable, Text, Text, Text> {

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        System.out.println("================5=======LoadHadoopMapper==key=" + key + " value=" + value);
        // Job 1 wrote tab-separated key/value lines; split them back apart.
        String[] v = value.toString().split("\t");
        System.out.println("================6=======LoadHadoopMapper==v[0]=" + v[0] + " v[1]=" + v[1]);
        context.write(new Text(v[0]), new Text(v[1]));
    }
}

// ---------- com/kensure/mr/mapper/LoadHbaseMapper.java ----------
package com.kensure.mr.mapper;

import java.io.IOException;

import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.Text;

public class LoadHbaseMapper extends TableMapper<Text, Text> {

    @Override
    public void map(ImmutableBytesWritable row, Result result, Context context)
            throws IOException, InterruptedException {
        System.out.println("================7=======LoadHbaseMapper==row=" + row);
        String rowkey = Bytes.toString(result.getRow());
        String name = Bytes.toString(result.getValue(Bytes.toBytes("Lcc_liezu"), Bytes.toBytes("name")));
        String sex = Bytes.toString(result.getValue(Bytes.toBytes("Lcc_liezu"), Bytes.toBytes("sex")));
        String age = Bytes.toString(result.getValue(Bytes.toBytes("Lcc_liezu"), Bytes.toBytes("age")));
        // The original mapper only prints the cells; it never writes to the context.
        System.out.println("================8========rowkey==" + rowkey);
        System.out.println("================8==========" + name + "\t" + sex + "\t" + age);
    }
}

// ---------- com/kensure/mr/reducer/SaveHadoopReduce.java ----------
package com.kensure.mr.reducer;

import java.io.IOException;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class SaveHadoopReduce extends Reducer<Text, Text, Text, Text> {

    @Override
    public void reduce(Text key, Iterable<Text> its, Context context)
            throws IOException, InterruptedException {
        System.out.println("================12=========key=" + key);
        for (Text text : its) {
            // The original code iterates but writes nothing; the pairing
            // logic below was left commented out by the author.
        }
        // Iterator<Text> it = its.iterator();
        // Map<String, String> maps = new TreeMap<String, String>();
        // while (it.hasNext()) {
        //     String map = it.next().toString();
        //     String[] str = map.split("\\|");
        //     maps.put(str[2], map);
        // }
        // List<String> list = new ArrayList<String>();
        // list.addAll(maps.keySet());
        // if (list.size() >= 2) {
        //     for (int x = 0; x < list.size(); x++) {
        //         for (int y = 0; y < list.size(); y++) {
        //             if (!list.get(x).equalsIgnoreCase(list.get(y))) {
        //                 String map1 = maps.get(list.get(x));
        //                 String map2 = maps.get(list.get(y));
        //                 String newKey = list.get(x) + "-" + list.get(y);
        //                 StringBuffer sb = new StringBuffer();
        //                 sb.append(map1).append(",").append(map2);
        //                 context.write(new Text(newKey), new Text(sb.toString()));
        //             }
        //         }
        //     }
        // }
    }
}

// ---------- com/kensure/mr/reducer/SaveHbaseReduce.java ----------
package com.kensure.mr.reducer;

import java.io.IOException;
import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;

import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableReducer;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.Text;

public class SaveHbaseReduce extends TableReducer<Text, Text, ImmutableBytesWritable> {

    @Override
    public void reduce(Text key, Iterable<Text> its, Context context)
            throws IOException, InterruptedException {
        System.out.println("================12=========key=" + key);

        // The incoming key has the form ZJHM-ZJHM (a pair of ID numbers).
        Put put = new Put(Bytes.toBytes(key.toString()));
        int sameFh = 0;
        StringBuffer str = new StringBuffer();
        Iterator<Text> it = its.iterator();
        Map<String, Integer> tmp = new HashMap<String, Integer>();
        tmp.put("Samefh_", 0);

        while (it.hasNext()) {
            Text text = it.next();
            System.out.println("================13=========text=" + text);
            String[] persones = text.toString().split(",");

            // Each value has the form ROWKEY|LGBM|ZJHM|NAME|FH|KSSJ|JSSJ|XZQH.
            String[] person_1 = persones[0].split("\\|");
            String[] person_2 = persones[1].split("\\|");
            System.out.println("================13=========person_1=" + person_1[0]);
            System.out.println("================13=========person_2=" + person_2[0]);

            // Same room number (FH)?
            if (person_1[4].equalsIgnoreCase(person_2[4])) {
                sameFh++;
            }
            System.out.println("================13=========sameFh=" + sameFh);

            // Count occurrences per administrative region (XZQH).
            String xzqh = person_1[7];
            System.out.println("================13=========xzqh=" + xzqh);
            if (tmp.containsKey(xzqh)) {
                tmp.put(xzqh, tmp.get(xzqh) + 1);
            } else {
                tmp.put(xzqh, 1);
            }

            // Accumulate the raw detail records.
            if (str.length() > 0) {
                str.append(",");
            }
            str.append("[").append(text.toString()).append("]");
        }

        boolean flag = false;
        System.out.println("================14=========sameFh=" + (sameFh >= 1));

        // Same room at least once. (Put.add is the pre-1.0 client API.)
        if (sameFh >= 1) {
            flag = flag || true;
            put.add(Bytes.toBytes("base"), Bytes.toBytes("samefh"), Bytes.toBytes("" + sameFh));
        }

        // Same administrative region at least three times.
        for (String tmpKey : tmp.keySet()) {
            if (tmp.get(tmpKey) >= 3) {
                flag = flag || true;
                put.add(Bytes.toBytes("base"), Bytes.toBytes("samexzqh"), Bytes.toBytes(tmp.get(tmpKey) + ""));
            }
        }

        // More than one distinct administrative region.
        if (tmp.size() >= 2) {
            flag = flag || true;
            put.add(Bytes.toBytes("base"), Bytes.toBytes("unsamexzqh"), Bytes.toBytes(tmp.size() + ""));
        }

        // The author writes the detail column unconditionally; flag is computed but unused.
        if (true) {
            put.add(Bytes.toBytes("base"), Bytes.toBytes("detail"), Bytes.toBytes(str.toString()));
            System.out.println("================15==========" + put);
            context.write(null, put);
        }
        System.out.println("================16==========");
    }
}

4. Download the two Hadoop 2.7.3 helper files, hadoop.dll and winutils.exe. Put hadoop.dll under C:\Windows\System32 on the local machine and reboot. Put winutils.exe into the opt/hzjs/hadoop-2.7.3/bin/ directory on the server, on every machine in the cluster. Copy the cluster's Hadoop installation to the local E:\02-hadoop\hadoop-2.7.3\ directory, and copy its configuration files into the project.

Without hadoop.dll in place, the job fails with the error below (access0 is a native Windows method that Hadoop loads from hadoop.dll):

Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z