A pitfall when configuring Hadoop in pseudo-distributed mode under Ubuntu 16.04


I won't walk through the single-node/pseudo-distributed configuration of Hadoop under Ubuntu 16.04 step by step here. For details, please see the link below:

Hadoop Installation Tutorial: standalone/pseudo-distributed configuration (Hadoop 2.6.0 / Ubuntu 14.04)

I ran into a problem after configuring the pseudo-distributed files: the NameNode formatted correctly, but when I started HDFS I got the following error:

[email protected]-virtual-machine:/usr/local/hadoop$ ./sbin/start-dfs.sh
starting namenodes on [localhost]
localhost: Error: JAVA_HOME is not set and could not be found.
localhost: Error: JAVA_HOME is not set and could not be found.
starting secondary namenodes [0.0.0.0]
0.0.0.0: Error: JAVA_HOME is not set and could not be found.

This error usually occurs when your Java environment is not configured correctly. There are plenty of guides on configuring Java under Ubuntu; a quick search on Baidu or Google will turn them up.
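Before blaming Hadoop, it is worth confirming that the directory you plan to point JAVA_HOME at really is a JDK home. A minimal sketch (the `jdk_path` value below is the JDK path used later in this post; adjust it to your machine):

```shell
# Check whether a directory looks like a usable JDK home before
# pointing JAVA_HOME at it.
is_jdk_home() {
    # A JDK home must contain both the java launcher and the javac compiler
    # (a bare JRE has java but no javac).
    [ -x "$1/bin/java" ] && [ -x "$1/bin/javac" ]
}

# Assumed example path, matching the install location used in this post.
jdk_path=/usr/local/java/jdk1.8.0_151
if is_jdk_home "$jdk_path"; then
    echo "JAVA_HOME candidate OK: $jdk_path"
else
    echo "Not a JDK home: $jdk_path"
fi
```

You can also run `java -version` and `echo $JAVA_HOME` in a fresh shell to confirm the environment variables are actually exported.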

If your environment is Ubuntu 16.04 with Hadoop 2.7.4 and you have confirmed that Java is configured correctly (the other solutions posted online for this error seem to be outdated), perhaps my solution below will work for you:

vim ./etc/hadoop/hadoop-env.sh    # open the Hadoop environment configuration script under your Hadoop install path

Then, in hadoop-env.sh:

# The only required environment variable is JAVA_HOME. All others are
# optional. When running a distributed configuration it is best to
# set JAVA_HOME in this file, so that it is correctly defined on
# remote nodes.

# The Java implementation to use.
JAVA_HOME=/usr/local/java/jdk1.8.0_151    # your Java path
export JAVA_HOME                          # add these two lines
export JAVA_HOME=${JAVA_HOME}

# The jsvc implementation to use. Jsvc is required to run secure datanodes
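Instead of editing the file in vim, the same three lines can be appended non-interactively. A sketch (the `append_java_home` helper is my own; the hadoop-env.sh and JDK paths are the ones from this post):

```shell
# Append the JAVA_HOME lines from this post to hadoop-env.sh,
# unless JAVA_HOME is already exported there.
append_java_home() {
    # $1 = path to hadoop-env.sh, $2 = JDK home to point JAVA_HOME at.
    if ! grep -q '^export JAVA_HOME' "$1"; then
        {
            echo "JAVA_HOME=$2"
            echo 'export JAVA_HOME'
            echo 'export JAVA_HOME=${JAVA_HOME}'
        } >> "$1"
    fi
}

# Usage on a live system (paths as in this post):
# append_java_home /usr/local/hadoop/etc/hadoop/hadoop-env.sh /usr/local/java/jdk1.8.0_151
```

The `grep` guard makes the helper safe to run more than once; it will not duplicate the lines on a second run.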

You can then re-execute:
./sbin/start-dfs.sh
# Since I had already started the daemons by the time I wrote this post, the output says they are already running
[email protected]-virtual-machine:/usr/local/hadoop$ ./sbin/start-dfs.sh
starting namenodes on [localhost]
localhost: namenode running as process 5085. Stop it first.
localhost: datanode running as process 5239. Stop it first.
starting secondary namenodes [0.0.0.0]
0.0.0.0: secondarynamenode running as process 5407. Stop it first.
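The "Stop it first" lines mean the HDFS daemons are already up; if you hit them after changing hadoop-env.sh, run `./sbin/stop-dfs.sh` and then `./sbin/start-dfs.sh` again. A small helper (a sketch; `hdfs_daemons` is my own name) for picking the HDFS daemons out of `jps` output so you can see what is running:

```shell
# Filter `jps`-style "PID Name" lines, keeping only the HDFS daemons
# started by start-dfs.sh.
hdfs_daemons() {
    grep -E ' (NameNode|DataNode|SecondaryNameNode)$'
}

# Usage on a live system:
# jps | hdfs_daemons    # then ./sbin/stop-dfs.sh before restarting
```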

Cause of the problem:

The root cause is unknown. After checking every configuration item on the machine without finding a problem, I arrived at this solution through a series of guesses and tests.

I hope this blog post helps you!
