Connecting from Eclipse on Win7 to Hadoop 2.4 in an Ubuntu Virtual Machine (3)


  • After the past few days of study we are basically ready to try our hand at a few small programs. Before that, a little preparation.
  • Big data: my first thought was how to use the data already sitting in a relational database together with Hadoop. A web search pointed to DBInputFormat, so let's write a small example that reads rows from the database, processes them, and writes the result out to a file.
  • Keep the database simple: id is an integer and test is a string. The requirement is equally simple: count how many times each value of the test column appears.
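Before involving Hadoop at all, the counting requirement can be sketched in plain Java; this hypothetical helper performs the same aggregation the reducer below will do:

```java
import java.util.*;

public class TestFieldCount {
    // Count how often each value of the "test" column appears --
    // the same sum-per-key aggregation the reducer performs.
    static Map<String, Integer> count(List<String> values) {
        Map<String, Integer> counts = new TreeMap<>();
        for (String v : values) {
            counts.merge(v, 1, Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        System.out.println(count(Arrays.asList("a", "b", "a"))); // {a=2, b=1}
    }
}
```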



  • The record class (it implements both Writable and DBWritable, so Hadoop can read it from the database and move it through the shuffle):

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

import org.apache.hadoop.io.Writable;
import org.apache.hadoop.mapreduce.lib.db.DBWritable;

public class DBRecoder implements Writable, DBWritable {
    String test;
    int id;

    // Writable: serialize the fields for the shuffle.
    @Override
    public void write(DataOutput out) throws IOException {
        out.writeUTF(test);
        out.writeInt(id);
    }

    // Writable: deserialize in the same field order.
    @Override
    public void readFields(DataInput in) throws IOException {
        test = in.readUTF();
        id = in.readInt();
    }

    // DBWritable: populate the fields from a result-set row.
    @Override
    public void readFields(ResultSet rs) throws SQLException {
        test = rs.getString("test");
        id = rs.getInt("id");
    }

    // DBWritable: bind the fields to a prepared statement (used when writing back to a DB).
    @Override
    public void write(PreparedStatement stmt) throws SQLException {
        stmt.setString(1, test);
        stmt.setInt(2, id);
    }
}
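The two Writable methods must read and write the fields in the same order. A minimal stand-alone sketch of that round trip (it mirrors DBRecoder's field order without depending on Hadoop classes):

```java
import java.io.*;

public class WritableRoundTrip {
    // Serialize (test, id) in the same order as DBRecoder.write(),
    // then read them back in the order DBRecoder.readFields() uses.
    static String roundTrip(String test, int id) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buf);
        out.writeUTF(test); // mirrors out.writeUTF(test)
        out.writeInt(id);   // mirrors out.writeInt(id)
        DataInputStream in = new DataInputStream(
                new ByteArrayInputStream(buf.toByteArray()));
        return in.readUTF() + "," + in.readInt();
    }

    public static void main(String[] args) throws IOException {
        System.out.println(roundTrip("hello", 42)); // prints hello,42
    }
}
```

If the read order ever diverges from the write order, the fields come back garbled, which is a common source of hard-to-diagnose MapReduce bugs.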
  • The MapReduce driver class:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.db.DBConfiguration;
import org.apache.hadoop.mapreduce.lib.db.DBInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public class DataCountTest {

    // Emits (test, 1) for every row read from the database.
    public static class TokenizerMapper
            extends Mapper<LongWritable, DBRecoder, Text, IntWritable> {
        public void map(LongWritable key, DBRecoder value, Context context)
                throws IOException, InterruptedException {
            context.write(new Text(value.test), new IntWritable(1));
        }
    }

    // Sums the counts for each distinct value of the test column.
    public static class IntSumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        private IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        // The output path is hard-coded here; it must not already exist on HDFS.
        args = new String[1];
        args[0] = "hdfs://192.168.203.137:9000/user/chenph/output1111221";

        Configuration conf = new Configuration();
        // Oracle connection settings; the JDBC driver jar must be on the job classpath.
        DBConfiguration.configureDB(conf, "oracle.jdbc.driver.OracleDriver",
                "jdbc:oracle:thin:@192.168.101.179:1521:orcl", "chenph", "chenph");

        String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
        Job job = new Job(conf, "DB count");
        job.setJarByClass(DataCountTest.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(IntWritable.class);

        // Read the id and test columns from table t1, ordered by id, no WHERE clause.
        String[] fields1 = { "id", "test" };
        DBInputFormat.setInput(job, DBRecoder.class, "t1", null, "id", fields1);

        FileOutputFormat.setOutputPath(job, new Path(otherArgs[0]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
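For intuition, DBInputFormat assembles a SELECT statement from the arguments passed to setInput (table name, conditions, order-by column, field names). The sketch below is an approximation of that query shape, not the actual Hadoop implementation, which also appends per-split LIMIT/OFFSET clauses:

```java
public class QuerySketch {
    // Approximates how DBInputFormat builds its query from
    // (tableName, conditions, orderBy, fieldNames).
    static String selectQuery(String table, String conditions,
                              String orderBy, String[] fields) {
        StringBuilder q = new StringBuilder("SELECT ");
        q.append(String.join(", ", fields));
        q.append(" FROM ").append(table);
        if (conditions != null && !conditions.isEmpty()) {
            q.append(" WHERE ").append(conditions);
        }
        if (orderBy != null && !orderBy.isEmpty()) {
            q.append(" ORDER BY ").append(orderBy);
        }
        return q.toString();
    }

    public static void main(String[] args) {
        // The same arguments the job passes to DBInputFormat.setInput.
        System.out.println(selectQuery("t1", null, "id",
                new String[] { "id", "test" }));
        // prints SELECT id, test FROM t1 ORDER BY id
    }
}
```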
--------------------------------------------------------------------------------------------------Problems encountered during development:
