Trivial: Hadoop's GenericOptionsParser class and Hadoop's Path class


GenericOptionsParser: the command line parser

GenericOptionsParser is the basic class the Hadoop framework uses to parse command line arguments. It recognizes the standard generic options, so an application can easily specify the namenode, the jobtracker, and additional configuration resources.

There is a blog post that covers this very well, so I will not go into detail: http://www.cnblogs.com/caoyuanzhanlang/archive/2013/02/21/2920934.html

 

Example:

Its simplest use, taken from WordCount, is as follows:

Configuration conf = new Configuration();
String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
if (otherArgs.length != 2) {
  System.err.println("Usage: wordcount <in> <out>");
  System.exit(2);
}
Job job = new Job(conf, "word count");
job.setJarByClass(WordCount.class);
job.setMapperClass(TokenizerMapper.class);
job.setCombinerClass(IntSumReducer.class);
job.setReducerClass(IntSumReducer.class);
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(IntWritable.class);
FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
System.exit(job.waitForCompletion(true) ? 0 : 1);
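
The driver above refers to TokenizerMapper and IntSumReducer, which are not shown in this article. A minimal sketch of those two classes, modeled on the standard Hadoop WordCount example (the class layout below is an assumption for illustration, not code from this article):

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

public class WordCount {

  // Splits each input line into tokens and emits (word, 1) for every token.
  public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
    private final static IntWritable one = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Sums the counts for each word; also used as the combiner in the job above.
  public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  // main() would contain the driver code shown above.
}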

For example, run the command bin/hadoop dfs -fs master:8020 -ls /data.

GenericOptionsParser picks up -fs master:8020 and applies it to conf.

The getRemainingArgs() method then returns the remaining arguments, -ls /data. In the WordCount example above, these remaining arguments are the input and output parameters.
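
To make that separation concrete, here is a minimal sketch. The ParserDemo class name and the sample options are made up for illustration, and the configuration key is an assumption tied to the Hadoop version (fs.default.name in 0.20.x, fs.defaultFS in newer releases):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.util.GenericOptionsParser;

public class ParserDemo {
  public static void main(String[] args) throws Exception {
    // Suppose the program is started with:
    //   -fs master:8020 -D mapred.reduce.tasks=4 -ls /data
    Configuration conf = new Configuration();
    String[] remaining = new GenericOptionsParser(conf, args).getRemainingArgs();

    // Generic options (-fs, -jt, -conf, -D, ...) have been absorbed into conf;
    // e.g. the default filesystem now points at master:8020 (kept under
    // fs.default.name in 0.20.x, fs.defaultFS in newer releases).
    System.out.println(conf.get("fs.default.name"));
    System.out.println(conf.get("mapred.reduce.tasks")); // prints 4

    // Whatever the parser does not recognize is left for the application.
    for (String arg : remaining) {
      System.out.println(arg); // prints -ls and /data
    }
  }
}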

 


How do you write a Hadoop program and submit the job?

Submit command: bin/hadoop jar B.jar wordcount /input /output, where jar tells Hadoop to run a jar file, B.jar is the jar file name, wordcount names the program to run, and /input and /output are the input and output paths (see the sketch below).
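
How the wordcount argument is resolved depends on how B.jar was built: if the jar's manifest does not declare a main class, Hadoop treats wordcount as the name of the class to run; if it does (as in the examples jar shipped with Hadoop), that main class typically uses org.apache.hadoop.util.ProgramDriver to map short names such as wordcount onto program classes. A rough sketch of the latter pattern (ExampleDriver is a name chosen for illustration; it assumes the WordCount class above is on the classpath):

import org.apache.hadoop.util.ProgramDriver;

// Registered as the jar's entry point; "wordcount" on the command line
// selects WordCount, and the remaining arguments (/input /output) are
// forwarded to WordCount.main().
public class ExampleDriver {
  public static void main(String[] args) {
    int exitCode = -1;
    ProgramDriver pgd = new ProgramDriver();
    try {
      pgd.addClass("wordcount", WordCount.class,
          "A map/reduce program that counts the words in the input files.");
      pgd.driver(args);
      exitCode = 0;
    } catch (Throwable e) {
      e.printStackTrace();
    }
    System.exit(exitCode);
  }
}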
 
Compiling and running the built-in WordCount.java program from the console with Hadoop reports an error. How can this be fixed?

The classpath needs more than that. Check the ~/hadoop-0.20.2/bin/hadoop-config file, which contains the $CLASSPATH required to run Hadoop programs.
It includes:
$HADOOP_COMMON_HOME/build/classes
$HADOOP_COMMON_HOME/build
$HADOOP_COMMON_HOME/build/test/classes
$HADOOP_COMMON_HOME/build/test/core/classes
$HADOOP_COMMON_HOME
