Startup steps:
1. Modify the Hadoop configuration files
In conf/mapred-site.xml, add the following:
    <property>
        <name>mapred.job.tracker</name>
        <value>localhost:9001</value>
    </property>
In conf/core-site.xml, add the following:
    <property>
        <name>fs.default.name</name>
        <value>hdfs://localhost:9000</value>
    </property>
In conf/hdfs-site.xml, add the following:
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
    <property>
        <name>dfs.permissions</name>
        <value>false</value>
    </property>
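Note that each of the XML snippets above belongs inside the file's <configuration> root element. As a sketch, assuming an otherwise empty file, conf/hdfs-site.xml would end up looking roughly like this:
    <?xml version="1.0"?>
    <configuration>
        <property>
            <name>dfs.replication</name>
            <value>1</value>
        </property>
        <property>
            <name>dfs.permissions</name>
            <value>false</value>
        </property>
    </configuration>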
In conf/hadoop-env.sh, add the following:
    export JAVA_HOME=/usr/lib/jvm/jdk1.6.0_45
where JAVA_HOME points to my JDK installation path.
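If you are not sure of the exact JDK path on your machine, the following is one way to locate it (an optional helper, not part of the original steps; paths vary by system):
    # Resolve the java binary through any symlinks; JAVA_HOME is the directory above bin/
    readlink -f "$(which java)"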
2. Set up password-free SSH login
Hadoop's start/stop scripts use ssh to launch the daemons. To avoid entering a password every time Hadoop is started or stopped, set up password-free login as follows:
1) Open a shell terminal and run:
    ssh-keygen -t rsa
This generates the public key file id_rsa.pub and the private key file id_rsa under the ~/.ssh/ directory.
2) Append the contents of the public key file id_rsa.pub to the authorized_keys file in the same directory:
    cd ~/.ssh/
    cat id_rsa.pub >> authorized_keys
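A quick way to verify the setup (my own suggestion, assuming a default sshd configuration) is to make sure the key file permissions are strict enough and then log in to localhost:
    # sshd usually ignores authorized_keys if it is group- or world-writable
    chmod 600 ~/.ssh/authorized_keys
    # should open a shell on localhost without asking for a password
    ssh localhost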
3. Start Hadoop
1) Format HDFS:
    ./hadoop namenode -format
2) Start Hadoop:
    ./start-all.sh
On my Ubuntu desktop installation, the following error was reported because the SSH server is not installed by default:
    connect to host localhost port 22: Connection refused
The workaround is to install the SSH server.
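On Ubuntu the OpenSSH server can be installed with apt; this is the usual package name, though your setup may differ:
    sudo apt-get install openssh-server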
After a successful startup, use the following commands to confirm that Hadoop is listening on its web UI ports:
    ~/release-1.0.1/bin$ netstat -ant | grep 50030
    tcp6       0      0 :::50030        :::*        LISTEN
    ~/release-1.0.1/bin$ netstat -ant | grep 50070
    tcp6       0      0 :::50070        :::*        LISTEN
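Another check I find handy (not part of the original text) is the jps utility shipped with the JDK, which lists the running Java processes; on a healthy pseudo-distributed Hadoop 1.x setup the daemons below should appear:
    $ jps
    # expected entries (PIDs will differ): NameNode, DataNode, SecondaryNameNode,
    # JobTracker, TaskTracker, and Jps itself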
Open http://localhost:50030/ and http://localhost:50070/ in a browser to view the status of MapReduce and HDFS.
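If no graphical browser is available, the same ports can be probed from the command line (an optional check, not from the original steps):
    # an HTTP status line indicates the JobTracker / NameNode web UIs are responding
    curl -sI http://localhost:50030/ | head -n 1
    curl -sI http://localhost:50070/ | head -n 1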
Start Hadoop for the first time