The processes started when the hadoop user launches HDFS


Detailed steps for the modification:

Three processes are started:
namenode: hadoop-01 (the host comes from bin/hdfs getconf -namenodes)
datanode: localhost (using the default slaves file, etc/hadoop/slaves)
secondarynamenode: 0.0.0.0
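
Before changing anything, you can confirm all three in one go. A minimal check, assuming you run it from the Hadoop install directory (the outputs shown match the starting state above):

[hadoop@hadoop-01 hadoop]$ bin/hdfs getconf -namenodes
hadoop-01
[hadoop@hadoop-01 hadoop]$ cat etc/hadoop/slaves
localhost
[hadoop@hadoop-01 hadoop]$ bin/hdfs getconf -secondaryNameNodes
0.0.0.0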

[hadoop@hadoop-01 ~]$ cd /opt/software/hadoop
[hadoop@hadoop-01 hadoop]$ echo "hadoop-01" > ./etc/hadoop/slaves
[hadoop@hadoop-01 hadoop]$ cat ./etc/hadoop/slaves
hadoop-01

[hadoop@hadoop-01 hadoop]$ vi ./etc/hadoop/hdfs-site.xml (add the following inside the <configuration> element):
<property>
<name>dfs.namenode.secondary.http-address</name>
<value>hadoop-01:50090</value>
</property>
<property>
<name>dfs.namenode.secondary.https-address</name>
<value>hadoop-01:50091</value>
</property>
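
To confirm the override is picked up, hdfs getconf can read back a single key; a quick check (the key is the one we just set):

[hadoop@hadoop-01 hadoop]$ bin/hdfs getconf -confKey dfs.namenode.secondary.http-address
hadoop-01:50090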

Restart HDFS:

[hadoop@hadoop-01 hadoop]$ sbin/stop-dfs.sh

[hadoop@hadoop-01 hadoop]$ sbin/start-dfs.sh
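
After the restart, jps should show all three daemons running on hadoop-01. Illustrative output (the pids will differ on your machine):

[hadoop@hadoop-01 hadoop]$ jps
2401 NameNode
2532 DataNode
2715 SecondaryNameNode
2890 Jps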

Now for a detailed analysis of the steps:

namenode: hadoop-01

datanode: localhost

secondarynamenode: 0.0.0.0

For these three processes, let's see how to modify the parameters that control where each one starts.

    1. The namenode reads its host from core-site.xml, so that is the file we modify:

<configuration>
<property>
<name>fs.defaultFS</name>
<value>hdfs://hadoop-01:9000</value>
</property>
</configuration>

With fs.defaultFS set this way, the namenode process starts on hadoop-01 instead of localhost.
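
After a restart, the startup banner should therefore name hadoop-01 rather than localhost (abridged, illustrative output):

[hadoop@hadoop-01 hadoop]$ sbin/start-dfs.sh
Starting namenodes on [hadoop-01]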

For this to work, the hostname must resolve to the machine's IP address:

[hadoop@hadoop-01 hadoop-2.8.1]$ cat /etc/hosts
127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4
::1 localhost localhost.localdomain localhost6 localhost6.localdomain6
192.168.137.30 hadoop-01 (our machine name is already mapped to its IP address)
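
getent is a simple way to confirm the name actually resolves (assuming a standard Linux resolver):

[hadoop@hadoop-01 hadoop-2.8.1]$ getent hosts hadoop-01
192.168.137.30  hadoop-01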

To see how the three HDFS processes are started, read the startup shell script:

cat sbin/start-dfs.sh

For example, before core-site.xml was modified, startup reported the namenode host as localhost:

[hadoop@hadoop-01 hadoop-2.8.1]$ sbin/start-dfs.sh
Starting namenodes on [localhost]

so the namenode was being started under the local machine name.
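
The namenode section of sbin/start-dfs.sh in Hadoop 2.x looks roughly like this (abridged), which is why the namenode host comes from the configuration rather than from the slaves file:

# resolve the namenode host list from the configuration (core-site.xml)
NAMENODES=$($HADOOP_PREFIX/bin/hdfs getconf -namenodes)

echo "Starting namenodes on [$NAMENODES]"

# --hostnames pins the daemon to the resolved host(s)
"$HADOOP_PREFIX/sbin/hadoop-daemons.sh" \
  --config "$HADOOP_CONF_DIR" \
  --hostnames "$NAMENODES" \
  --script "$bin/hdfs" start namenode $nameStartOpt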

2. The datanode host list is read from the slaves file.

Looking at what is in slaves, we discover it is localhost, so we modify it:

Overwrite the slaves file so that it contains hadoop-01 (as done above).
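
In the same script, the datanode section passes no --hostnames option, so hadoop-daemons.sh falls back to the slaves file, etc/hadoop/slaves (again abridged from the Hadoop 2.x script):

# datanodes (using default slaves file)
"$HADOOP_PREFIX/sbin/hadoop-daemons.sh" \
  --config "$HADOOP_CONF_DIR" \
  --script "$bin/hdfs" start datanode $dataStartOpt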

3. For the secondarynamenode, check the script information and the default configuration.

We can use a command to view the corresponding result, or consult the official website:

http://hadoop.apache.org/docs/r2.8.3/hadoop-project-dist/hadoop-common/SingleCluster.html

Scroll down the page and click hdfs-default.xml.

Up to the last lesson, our setup was still using the defaults from hdfs-default.xml.

Then press Ctrl+F and search for "secondary"; the first match is the setting we need.
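
In the 2.8 documentation that first match is the default we override in hdfs-site.xml above; the entry looks like this:

<property>
<name>dfs.namenode.secondary.http-address</name>
<value>0.0.0.0:50090</value>
<description>The secondary namenode http server address and port.</description>
</property>

Overriding it with hadoop-01:50090, as we did earlier, is what moves the secondarynamenode off 0.0.0.0.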