YARN Shell Operations and Management
7.1 Starting YARN
YARN has two daemon processes: ResourceManager and NodeManager. Start them from the Hadoop installation directory (hadoop-2.2.0 in this environment):
$ sbin/yarn-daemon.sh start resourcemanager
$ sbin/yarn-daemon.sh start nodemanager
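To verify that both daemons came up, and to stop them afterwards, the same script accepts stop (a minimal sketch; jps ships with the JDK and simply lists the running Java processes):
$ jps                                      # should show ResourceManager and NodeManager
$ sbin/yarn-daemon.sh stop nodemanager     # stop the daemons when finished
$ sbin/yarn-daemon.sh stop resourcemanager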
7.2 YARN Web Management Interface
YARN management addresses:
ResourceManager: hostname:8088. In this environment: http://hadoop-yarn.dragon.org:8088
NodeManager: hostname:8042. In this environment: http://hadoop-yarn.dragon.org:8042
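Both addresses can also be probed from the shell before opening them in a browser (a minimal sketch, assuming curl is installed and the hostname resolves; /cluster and /node are the default landing pages of the two UIs):
$ curl -s http://hadoop-yarn.dragon.org:8088/cluster | head   # ResourceManager UI: cluster and application overview
$ curl -s http://hadoop-yarn.dragon.org:8042/node | head      # NodeManager UI: per-node containers and logs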
7.3 Running MapReduce Programs
(1) The $HADOOP_HOME/share/hadoop/mapreduce directory contains many example programs (how to locate the jar is sketched after the list below):
An example program must be given as the first argument.
Valid program names are:
aggregatewordcount: An Aggregate based map/reduce program that counts the words in the input files.
aggregatewordhist: An Aggregate based map/reduce program that computes the histogram of the words in the input files.
bbp: A map/reduce program that uses Bailey-Borwein-Plouffe to compute exact digits of Pi.
dbcount: An example job that count the pageview counts from a database.
distbbp: A map/reduce program that uses a BBP-type formula to compute exact bits of Pi.
grep: A map/reduce program that counts the matches of a regex in the input.
join: A job that effects a join over sorted, equally partitioned datasets
multifilewc: A job that counts words from several files.
pentomino: A map/reduce tile laying program to find solutions to pentomino problems.
pi: A map/reduce program that estimates Pi using a quasi-Monte Carlo method.
randomtextwriter: A map/reduce program that writes 10GB of random textual data per node.
randomwriter: A map/reduce program that writes 10GB of random data per node.
secondarysort: An example defining a secondary sort to the reduce.
sort: A map/reduce program that sorts the data written by the random writer.
sudoku: A sudoku solver.
teragen: Generate data for the terasort
terasort: Run the terasort
teravalidate: Checking results of terasort
wordcount: A map/reduce program that counts the words in the input files.
wordmean: A map/reduce program that counts the average length of the words in the input files.
wordmedian: A map/reduce program that counts the median length of the words in the input files.
wordstandarddeviation: A map/reduce program that counts the standard deviation of the length of the words in the input files.
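The jar containing these programs can be located with a plain directory listing (a minimal sketch; exact file names depend on the Hadoop build), and running the examples jar without a program name prints the list above:
$ ls $HADOOP_HOME/share/hadoop/mapreduce     # hadoop-mapreduce-examples-2.2.0.jar is among the jars listed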
(2) How to run these programs
These examples are executed with the $HADOOP_HOME/bin/yarn jar command. For instance, to run the pi program (a complete invocation with arguments is sketched after the note below):
$ bin/yarn jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar
Note: bin/yarn jar is the command, and share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar is the path to the examples jar file.
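For completeness, a minimal sketch of full invocations with arguments; the map count 16, the sample count 1000, and the HDFS paths /input and /output are illustrative values only:
$ bin/yarn jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar pi 16 1000                  # estimate Pi with 16 map tasks, 1000 samples each
$ bin/yarn jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar wordcount /input /output    # /output must not exist beforehand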