Because sbt had to be installed manually for the first time, Internet access was required, so the VM's network adapter was switched to "Bridged" mode to reach the Internet.
Afterwards, however, running "spark-shell --master yarn --deploy-mode client" failed to start: it simply hung partway through,
as follows:
[[email protected] test_code]# spark-shell --master yarn --deploy-mode client
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/05/07 18:07:37 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Then it suddenly came to mind that the VM's network adapter mode had never been switched back. After resetting it to "Host-only" mode (the VMware VMs had all been configured in "Host-only" mode when the Hadoop cluster was installed), spark-shell started normally, as follows:
[[email protected] master]# spark-shell --master yarn --deploy-mode client
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/05/07 18:30:12 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/05/07 18:30:28 WARN yarn.Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
17/05/07 18:31:15 WARN metastore.ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
Spark context Web UI available at http://192.168.1.200:4040
Spark context available as 'sc' (master = yarn, app id = application_1494142860645_0001).
Spark session available as 'spark'.
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 2.1.0
/_/
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_121)
Type in expressions to have them evaluated.
Type :help for more information.
scala>
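A quick way to confirm the VM landed back on the cluster network is to check whether its address sits on the host-only subnet (192.168.1.0/24 here, inferred from the Web UI address above). A minimal sketch; `on_cluster_subnet` is a hypothetical helper and the `eth0` interface name is an assumption:

```shell
# Report whether an address belongs to the cluster's host-only subnet.
# The subnet 192.168.1.* is taken from the Spark Web UI address above;
# on_cluster_subnet is a hypothetical helper, not a standard tool.
on_cluster_subnet() {
  case "$1" in
    192.168.1.*) echo yes ;;
    *)           echo no  ;;
  esac
}

# On a live VM (needs iproute2; the interface name is an assumption):
# on_cluster_subnet "$(ip -4 addr show eth0 | awk '/inet /{print $2}')"
on_cluster_subnet 192.168.1.200/24   # → yes
```

If this prints "no" right after switching adapter modes, the guest usually still holds a DHCP lease from the bridged network; restarting the network service (or the VM) refreshes it.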
BTW: checking the state of the active NameNode showed that the cluster was also unhealthy; it could not communicate with the other nodes, resulting in lost blocks. As shown on the Web UI:
There are 35 missing blocks. The following files may be corrupted:
blk_1073741947 /opt/hadoop/out_wordcount4/part-r-00000
blk_1073741946 /opt/hadoop/out_wordcount2/part-r-00000
blk_1073741945 /opt/hadoop/out_wordcount/part-r-00000
blk_1073741933 /opt/hadoop/input/README.txt
blk_1073741931 /hbase/MasterProcWALs/state-00000000000000000018.log
blk_1073741930 /hbase/oldWALs/slave4%2C16020%2C1489814171196.meta.1489821379742.meta
blk_1073741929 /hbase/oldWALs/slave3%2C16020%2C1489814171011.1489821375778
blk_1073741928 /hbase/oldWALs/slave4%2C16020%2C1489814171196.1489821375970
blk_1073741927 /hbase/oldWALs/slave5%2C16020%2C1489814170009.1489821374298
blk_1073741920 /hbase/data/hbase/meta/1588230740/info/8e011b40156f4eeab4e83caf63ee1d23
blk_1073741847 /hbase/data/hbase/namespace/3792ee8c4881d96201d73a19d76aa598/info/cc6c271a546248419df4c0988d191b4d
blk_1073741846 /hbase/data/hbase/namespace/3792ee8c4881d96201d73a19d76aa598/.regioninfo
blk_1073741845 /hbase/data/hbase/namespace/.tabledesc/.tableinfo.0000000001
blk_1073741968 /linkage/block_9.csv
blk_1073741967 /linkage/block_8.csv
blk_1073741839 /hbase/data/hbase/meta/.tabledesc/.tableinfo.0000000001
blk_1073741966 /linkage/block_7.csv
blk_1073741838 /hbase/data/hbase/meta/1588230740/.regioninfo
blk_1073741837 /hbase/hbase.id
blk_1073741965 /linkage/block_6.csv
blk_1073741836 /hbase/hbase.version
blk_1073741964 /linkage/block_5.csv
blk_1073741963 /linkage/block_4.csv
blk_1073741962 /linkage/block_3.csv
blk_1073741961 /linkage/block_2.csv
blk_1073741833 /tmp/hadoop-yarn/staging/history/done_intermediate/root/job_1489568197327_0001.summary
blk_1073741960 /linkage/block_1.csv
blk_1073741832 /out/part-r-00000
blk_1073741959 /linkage/block_10.csv
blk_1073741958 /sogou/SogouQ1.txt
blk_1073741957 /tmp/hadoop-yarn/staging/history/done_intermediate/root/job_1492443686126_0001_conf.xml
blk_1073741956 /tmp/hadoop-yarn/staging/history/done_intermediate/root/job_1492443686126_0001-1492443717631-root-word+count-1492443749993-1-1-SUCCEEDED-default-1492443728809.jhist
blk_1073741955 /tmp/hadoop-yarn/staging/history/done_intermediate/root/job_1492443686126_0001.summary
blk_1073741954 /opt/hadoop/out_1/part-r-00000
blk_1073741825 /word
Please check the logs or run fsck in order to identify the missing blocks. See the Hadoop FAQ for common causes and potential solutions.
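Following that advice, the missing blocks can be enumerated with `hdfs fsck`. The cluster commands below are shown as comments since they need a live NameNode; the runnable part parses a two-line sample of the report above to pull out the affected paths:

```shell
# Real HDFS commands (need a running cluster, so left as comments):
# hdfs fsck / -list-corruptfileblocks              # enumerate missing blocks
# hdfs fsck /opt/hadoop -files -blocks -locations  # per-file block detail
# hdfs fsck / -delete                              # last resort: drop corrupt files

# Each report line has the form "blk_<id> <path>"; extract the paths
# from a small sample of the report so the parsing can be shown here.
report='blk_1073741947 /opt/hadoop/out_wordcount4/part-r-00000
blk_1073741933 /opt/hadoop/input/README.txt'
paths=$(printf '%s\n' "$report" | awk '{print $2}')
printf '%s\n' "$paths"
```

With the affected paths in hand, files that can be regenerated (job outputs, sample data) are simply deleted and recreated, which clears the missing-block alarm.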
Spark-shell fails to start: a network problem