1. Retrying connect to server: localhost/127.0.0.1:9000.
Call to localhost/127.0.0.1:9000 failed on connection exception: java.net.ConnectException: Connection refused
Reason: Hadoop is not started, or the NameNode address in core-site.xml is misconfigured.
Solution: Start Hadoop, or correct the configuration.
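A minimal core-site.xml sketch for reference, assuming a single-node setup where the NameNode is expected at localhost:9000 (the property is named fs.default.name in older Hadoop releases and fs.defaultFS in 2.x and later):

<configuration>
  <property>
    <!-- URI of the NameNode; this is the address the failing client above is trying to reach -->
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

After editing the file, restart Hadoop (for example with stop-all.sh and start-all.sh) so the new address takes effect.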
2. Exception in thread "main" java.io.IOException: Call to localhost/127.0.0.1:9000 failed on local exception: java.io.EOFException
This error indicates that the Hadoop version used by the client and the Hadoop version running on the server are inconsistent. Change the Hadoop version the client references so that it matches the server.
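One way to confirm the mismatch, assuming shell access to both machines, is to compare the output of the standard hadoop version command on each side:

# on the client machine
hadoop version
# on the server (NameNode) machine
hadoop version
# the two reported versions must match; if they do not, replace the Hadoop jar
# on the client's classpath with the one shipped by the server's release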
3. Cannot delete /home/hadoop/part. Name node is in safe mode.
This means that the Hadoop NameNode is in safe mode.
So what is the safe mode of Hadoop?
When the distributed file system starts, it begins in safe mode. While the file system is in safe mode, its contents may not be modified or deleted until safe mode ends. The main purpose of safe mode is to check the validity of the data blocks on each DataNode at startup and to copy or delete blocks as required by policy. Safe mode can also be entered by command at runtime. In practice, modifying or deleting files right after the system starts also produces errors saying that safe mode does not allow modification; you only need to wait a short while (about 18 s).
The NameNode enters safe mode first when it starts, and if the proportion of blocks missing from the DataNodes reaches a certain level (1 - dfs.safemode.threshold.pct), the system remains in safe mode, that is, in a read-only state.
dfs.safemode.threshold.pct (default 0.999f) means that when HDFS starts, the NameNode leaves safe mode only once the number of blocks reported by the DataNodes reaches 0.999 times the number of blocks recorded in the metadata; otherwise it stays in this read-only mode. If the value is set to 1, HDFS stays in safe mode permanently.
The following line, taken from the NameNode startup log, shows the reported block ratio of 1.0000 reaching the threshold of 0.9990:
The ratio of reported blocks 1.0000 has reached the threshold 0.9990. Safe mode will be turned off automatically in a few seconds.
hadoop dfsadmin -safemode leave
There are two ways to leave this safe mode:
1. Set dfs.safemode.threshold.pct to a smaller value; the default is 0.999 (see the sketch after this list).
2. Run the hadoop dfsadmin -safemode leave command to force the NameNode out of safe mode.
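A minimal hdfs-site.xml sketch for option 1; the value 0.95 below is purely illustrative, and the right value depends on how many blocks the cluster can tolerate missing at startup:

<configuration>
  <property>
    <!-- fraction of blocks that DataNodes must report before the NameNode leaves safe mode -->
    <name>dfs.safemode.threshold.pct</name>
    <value>0.95</value>
  </property>
</configuration>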
http://bbs.hadoopor.com/viewthread.php?tid=61&extra=page=1
Users can operate on safe mode with hadoop dfsadmin -safemode <value>, where value is one of:
enter - enter safe mode
leave - force the NameNode to leave safe mode
get   - report whether safe mode is on
wait  - wait until safe mode ends
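For example, on the command line (syntax as used in the Hadoop 1.x line; later releases provide the equivalent hdfs dfsadmin front end):

hadoop dfsadmin -safemode get    # prints whether safe mode is ON or OFF
hadoop dfsadmin -safemode wait   # blocks until safe mode ends
hadoop dfsadmin -safemode leave  # forces the NameNode out of safe mode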
4. mismatched input 'from' expecting CharSetLiteral in character string literal: the column name contains special characters; wrap it in backquotes (`) and it works.
hive> select id,\'_bucketname\' from default__table02_table02_index__;
FAILED: Parse Error: line 1:26 cannot recognize input 'from' in select expression
hive> select id,`_bucketname` from default__table02_table02_index__;