1. Checking Hadoop startup and running status with commands and log files
On the NameNode side, you can view the NameNode's running log with
tail -100 /var/log/hadoop/hadoop/hadoop-hadoop-namenode-hadoop-namenode.log
Similarly, you can view the DataNode's running log with
cat /var/log/hadoop/hadoop/hadoop-hadoop-datanode-hadoop-datanode1.log
Run the jps command on the DataNode and the NameNode respectively to see which services have been started.
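If you check this often, the per-node checks can be wrapped in a small script. The following is only a minimal sketch, not part of the original setup: it assumes the hostnames hadoop-namenode and hadoop-datanode1 and passwordless SSH from the machine where it runs; adjust the host list to your cluster.
#!/bin/bash
# Minimal sketch: list the running Hadoop daemons on each node with jps.
# The hostnames and SSH access are assumptions; change them for your cluster.
for host in hadoop-namenode hadoop-datanode1; do
  echo "=== $host ==="
  ssh "$host" jps
done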
2. NameNode cannot be started:
Cannot lock storage ....../tmp/dfs/name. The directory is already locked.
This may be because the account running Hadoop has no permission on the tmp/dfs/name folder. It can be fixed with the following command:
chown -R hadoop:hadoop /usr/hadoop
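If the error persists, it can help to confirm who actually owns the directory the NameNode is trying to lock. The commands below are a sketch; the path assumes hadoop.tmp.dir is /usr/hadoop/tmp, as in the core-site.xml shown in section 3.
# Check current ownership of the name directory (path is an assumption).
ls -ld /usr/hadoop/tmp/dfs/name
# It should be owned by the account that starts Hadoop, e.g. hadoop:hadoop.
# If not, re-apply ownership recursively and restart the NameNode:
chown -R hadoop:hadoop /usr/hadoop/tmp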
3. DataNode cannot be started:
The DataNode log shows namenode namespaceID = 1713611278; datanode namespaceID = 596511341 (an incompatible namespaceIDs error).
This problem is usually caused by running hadoop namenode -format more than once on the NameNode side. Find the hadoop.tmp.dir property in Hadoop's core-site.xml file (the configuration file name differs between Hadoop versions) and clear the corresponding folder. Example:
[hadoop@hadoop-datanode1 hadoop]$ cat core-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
<!-- global properties -->
<property>
<name>hadoop.tmp.dir</name>
<value>/usr/hadoop/tmp</value>
</property>
</configuration>
Clear the folder:
[hadoop@hadoop-datanode1 tmp]$ rm -rf /usr/hadoop/tmp/*
Then restart Hadoop and run jps on the DataNode to check whether the DataNode daemon has started.
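If you want to avoid wiping the DataNode's data directory, a commonly used alternative is to make the DataNode's namespaceID match the NameNode's. The commands below are a sketch, assuming the default name/data directory layout under the hadoop.tmp.dir shown above.
# On the NameNode, read its namespaceID (path is an assumption):
grep namespaceID /usr/hadoop/tmp/dfs/name/current/VERSION
# On the DataNode, edit the namespaceID line in its VERSION file to the
# NameNode's value, then restart the DataNode:
vi /usr/hadoop/tmp/dfs/data/current/VERSION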
4. When running the wordcount program, HDFS cannot find the input folder:
Input path does not exist: hdfs://localhost:9000/user/input
In a cluster environment, all files to be processed live in HDFS, so you must first copy the input files into an HDFS folder. The following example creates a folder in HDFS, copies the prepared wordcount input there, and then runs the program.
[hadoop@hadoop-namenode ~]$ hadoop fs -mkdir /tmp/wordcount/
[hadoop@hadoop-namenode ~]$ hadoop fs -put /home/hadoop/wordcount/input /tmp/wordcount/
[hadoop@hadoop-namenode ~]$ hadoop fs -ls /tmp/wordcount/
[hadoop@hadoop-namenode ~]$ hadoop jar /home/hadoop/hadoop-examples-1.1.2.jar wordcount /tmp/wordcount/input/input /tmp/wordcount/output
View the results:
hadoop fs -cat /tmp/wordcount/output/part-r-00000
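If you prefer to inspect the output locally rather than through hadoop fs -cat, the following sketch merges the output into a single local file; the local file name is only an example, and the HDFS path assumes the same output directory as above.
# Merge the job's output files from HDFS into one local file and view it.
hadoop fs -getmerge /tmp/wordcount/output /home/hadoop/wordcount-output.txt
cat /home/hadoop/wordcount-output.txt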