Zhou haihan/Wen
2013.4.2
You can convert the date '08/08/16 20:56:29' from the hbase log into a timestamp. The operation is as follows:
hbase(main):021:0> import java.text.SimpleDateFormat
hbase(main):022:0> import java.text.ParsePosition
hbase(main):023:0> SimpleDateFormat.new("yy/MM/dd HH:mm:ss").parse("08/08/16 20:56:29", ParsePosition.new(0)).getTime()
=> 1218920189000
It can also be reversed.
hbase(main):021:0> import java.util.Date
hbase(main):022:0> Date.new(1218920189000).toString()
=> "Sat Aug 16 20:56:29 UTC 2008"
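The shell is just calling Java classes through JRuby, so the same round trip can be sketched in plain Java. A minimal sketch; the class name HBaseTimestampDemo is invented for illustration, and the UTC time zone is assumed so the numbers match the shell output above:

```java
import java.text.ParsePosition;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class HBaseTimestampDemo {
    public static void main(String[] args) {
        // Same pattern the shell session uses; UTC is assumed so the
        // result matches the "Sat Aug 16 20:56:29 UTC 2008" output above.
        SimpleDateFormat fmt = new SimpleDateFormat("yy/MM/dd HH:mm:ss");
        fmt.setTimeZone(TimeZone.getTimeZone("UTC"));

        long ts = fmt.parse("08/08/16 20:56:29", new ParsePosition(0)).getTime();
        System.out.println(ts);                       // 1218920189000

        // Reverse direction: format the timestamp back into the date string.
        System.out.println(fmt.format(new Date(ts))); // 08/08/16 20:56:29
    }
}
```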
$ bin/hbase org.apache.hadoop.hbase.mapreduce.CopyTable [--starttime=X] [--endtime=Y] [--new.name=NEW] [--peer.adr=ADR] tablename
hbase(main):001:0> import java.text.SimpleDateFormat
=> Java::JavaText::SimpleDateFormat
hbase(main):002:0> import java.text.ParsePosition
=> Java::JavaText::ParsePosition
hbase(main):004:0> SimpleDateFormat.new("yyyy/MM/dd HH:mm:ss").parse("2013/03/28 00:00:00", ParsePosition.new(0)).getTime()
=> 1364400000000
hbase(main):005:0> SimpleDateFormat.new("yyyy/MM/dd HH:mm:ss").parse("2013/03/28 00:00:10", ParsePosition.new(0)).getTime()
=> 1364400010000
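Note that 1364400000000 ms corresponds to 2013/03/28 00:00:00 only in UTC+8 (China Standard Time), which is evidently where the shell session ran. A minimal Java sketch reproducing both values; the class name TimeRangeDemo and the explicit GMT+8 zone are assumptions for illustration:

```java
import java.text.ParsePosition;
import java.text.SimpleDateFormat;
import java.util.TimeZone;

public class TimeRangeDemo {
    public static void main(String[] args) {
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy/MM/dd HH:mm:ss");
        // The shell values above only come out if parsing happens in UTC+8;
        // that time zone is assumed here.
        fmt.setTimeZone(TimeZone.getTimeZone("GMT+8"));

        long start = fmt.parse("2013/03/28 00:00:00", new ParsePosition(0)).getTime();
        long end   = fmt.parse("2013/03/28 00:00:10", new ParsePosition(0)).getTime();
        System.out.println(start); // 1364400000000
        System.out.println(end);   // 1364400010000
    }
}
```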
[hbase@h46 sh]$ hbase org.apache.hadoop.hbase.mapreduce.CopyTable
Usage: CopyTable [general options] [--starttime=X] [--endtime=Y] [--new.name=NEW] [--peer.adr=ADR] <tablename>
To copy part of the data into another table, myolc, you must first create that table, or specify another cluster with:
--peer.adr=server1,server2,server3:2181:/hbase
[hbase@h46 hbase]$ hbase org.apache.hadoop.hbase.mapreduce.CopyTable --starttime=1364400000000 --endtime=1364400010000 --new.name=myolc online_count
The Export utility dumps the table content to serialized files on HDFS; it is invoked as follows:
$ bin/hbase org.apache.hadoop.hbase.mapreduce.Export <tablename> <outputdir> [<versions> [<starttime> [<endtime>]]]
Export 2000 seconds' worth of data:
[hbase@h46 hbase]$ hbase org.apache.hadoop.hbase.mapreduce.Export online_count onlinecount 1 1364400000000 1364402000000
[hbase@h46 hbase]$ hadoop fs -ls /user/hbase/onlinecount
Found 3 items
-rw-r--r--   3 hbase supergroup          0 /user/hbase/onlinecount/_SUCCESS
drwxr-xr-x   - hbase supergroup          0 /user/hbase/onlinecount/_logs
-rw-r--r--   3 hbase supergroup        451 /user/hbase/onlinecount/part-m-00000
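The --endtime passed above is nothing more than the start time plus 2000 seconds, expressed in milliseconds. A one-line sanity check (class name ExportWindow is invented for illustration):

```java
public class ExportWindow {
    public static void main(String[] args) {
        long start = 1364400000000L;       // 2013/03/28 00:00:00 (UTC+8), in ms
        long end = start + 2000L * 1000L;  // add 2000 seconds, converted to ms
        System.out.println(end);           // 1364402000000
    }
}
```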
The Import utility loads the exported data back into HBase; it is invoked as follows:
$ bin/hbase org.apache.hadoop.hbase.mapreduce.Import
[zhouhh@Hadoop48 ~]$ hadoop fs -put olc onlinecount
[zhouhh@Hadoop48 ~]$ hbase shell
hbase(main):001:0> create 'online_count', 'info'
[zhouhh@Hadoop48 ~]$ hbase org.apache.hadoop.hbase.mapreduce.Import online_count onlinecount
Related blog posts:
- Convert timestamp in hbase shell to readable format
- ClassNotFoundException in hadoop
- From HDFS data analysis to HBase
Original article: Copy some HBase tables for testing.