Hadoop Common Errors

Source: Internet
Author: User
Tags: hadoop, fs

1. View Hadoop startup and running status using commands and log files

On the NameNode side, you can use

tail -100 /var/log/hadoop/hadoop/hadoop-hadoop-namenode-hadoop-namenode.log

to view the NameNode's running log.

 

You can also use

cat /var/log/hadoop/hadoop/hadoop-hadoop-datanode-hadoop-datanode1.log

to view the DataNode's running log.

 

Run the jps command on the DataNode and the NameNode respectively to list the Hadoop daemons that have started.
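The tail command above can be illustrated with a mock log file, since log paths vary per cluster (the mock path below is an example only):

```shell
# Mock log for illustration only; on a real node point tail at the
# hadoop-*-namenode-*.log file under your log directory.
seq 1 200 | sed 's/^/2013-01-01 INFO line /' > /tmp/mock-namenode.log

# tail -100 keeps the last 100 lines (lines 101-200 here);
# head -n 1 shows the first of them.
tail -100 /tmp/mock-namenode.log | head -n 1   # prints "2013-01-01 INFO line 101"
```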

 

2. NameNode cannot be started:

Cannot lock storage ....../tmp/dfs/name. The directory is already locked.

This is usually because the account running Hadoop has no permission on the tmp/dfs/name folder. You can fix it with the following command:

chown -R hadoop:hadoop /usr/hadoop
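Before restarting the NameNode, it can help to confirm who owns the storage directory. A minimal sketch, using an example path (substitute /usr/hadoop on a real node):

```shell
# Example path only; on a real node inspect /usr/hadoop itself.
# chown typically requires root, so run it via sudo or as root.
mkdir -p /tmp/demo-hadoop/tmp/dfs/name

# The third and fourth columns of ls -ld show the owner and group,
# which should both be the hadoop account.
ls -ld /tmp/demo-hadoop/tmp/dfs/name
```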

 

3. DataNode cannot be started:

The DataNode log shows a namespaceID mismatch: namenode namespaceID = 1713611278; datanode namespaceID = 596511341

This problem is usually caused by running hadoop namenode -format more than once on the NameNode side. Find the hadoop.tmp.dir property in core-site.xml (the file name differs between Hadoop versions) and clear the corresponding folder. Example:

[hadoop@hadoop-datanode1 hadoop]$ cat core-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
<!-- global properties -->
<property>
<name>hadoop.tmp.dir</name>
<value>/usr/hadoop/tmp</value>
</property>
</configuration>

Clear the folder:

[hadoop@hadoop-datanode1 tmp]$ rm -rf /usr/hadoop/tmp/*

Then restart Hadoop and run jps on the DataNode to check whether the DataNode daemon has started.
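Before clearing anything, you can confirm the mismatch by comparing the namespaceID recorded in the two VERSION files. The sketch below mocks those files under /tmp/demo for illustration; on a real cluster look under hadoop.tmp.dir (e.g. /usr/hadoop/tmp/dfs/name/current/VERSION on the NameNode and the dfs/data/current/VERSION on the DataNode):

```shell
# Mock VERSION files for illustration; the real ones live under
# hadoop.tmp.dir on each node. The IDs below are the ones from the
# error message in this section.
mkdir -p /tmp/demo/name/current /tmp/demo/data/current
echo "namespaceID=1713611278" > /tmp/demo/name/current/VERSION
echo "namespaceID=596511341"  > /tmp/demo/data/current/VERSION

# Differing values confirm the NameNode was re-formatted after the
# DataNode last registered.
grep namespaceID /tmp/demo/name/current/VERSION /tmp/demo/data/current/VERSION
```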

 

4. When running the wordcount program, HDFS cannot find the input folder:

Input path does not exist: hdfs://localhost:9000/user/input

In a cluster environment, all files to be processed live in HDFS, so you must copy the input files into an HDFS folder first. The following example creates a folder in HDFS, copies the prepared wordcount input into it, and then runs the program.

[hadoop@hadoop-namenode ~]$ hadoop fs -mkdir /tmp/wordcount/
[hadoop@hadoop-namenode ~]$ hadoop fs -put /home/hadoop/wordcount/input /tmp/wordcount/
[hadoop@hadoop-namenode ~]$ hadoop fs -ls /tmp/wordcount/
[hadoop@hadoop-namenode ~]$ hadoop jar /home/hadoop/hadoop-examples-1.1.2.jar wordcount /tmp/wordcount/input/input /tmp/wordcount/output

 

View the results:

hadoop fs -cat /tmp/wordcount/output/part-r-00000
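For intuition about what part-r-00000 contains, here is a plain-shell analogue of the word count over a tiny sample (an illustration only, not the Hadoop job itself):

```shell
# Build a two-line sample input.
printf 'hello world\nhello hadoop\n' > /tmp/wc-input.txt

# Split into one word per line, count duplicates, and print
# word<TAB>count pairs, which is the shape of part-r-00000.
tr -s ' ' '\n' < /tmp/wc-input.txt | sort | uniq -c | awk '{print $2 "\t" $1}'
# prints:
# hadoop  1
# hello   2
# world   1
```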
