Installing Hadoop 2.6.0 in Pseudo-Distributed Mode on CentOS 6.5


Install the JDK

yum install java-1.7.0-openjdk*

Verify the installation with java -version.

 

Create a hadoop user and set it up so that it can SSH to localhost without a password:

su - hadoop
ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
cd /home/hadoop/.ssh
chmod 600 authorized_keys

Watch the permissions here: the .ssh directory must be 700 and authorized_keys must be 600, or sshd will refuse to use the key.
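The permission requirements above can be applied in one step; a minimal sketch, assuming you are logged in as the hadoop user:

```shell
# sshd silently ignores authorized_keys when the directory or the file
# is too permissive; tighten both to the values the note requires.
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys
```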

Verify:

$ ssh localhost
Last login: Sun Nov 17 22:11:55 2013

 

Unpack Hadoop and install it under /opt/hadoop:

tar -xzvf hadoop-2.6.0.tar.gz
mv -i /home/erik/hadoop-2.6.0 /opt/hadoop
chown -R hadoop /opt/hadoop

 

The files to modify are hadoop-env.sh, core-site.xml, hdfs-site.xml, yarn-site.xml, and mapred-site.xml:

cd /opt/hadoop/etc/hadoop

 

Set the Java path in hadoop-env.sh. Referencing the ${JAVA_HOME} shell variable does not seem to take effect here, so write the path out explicitly:

export JAVA_HOME={your JDK install path}

 

core-site.xml

<configuration>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/opt/hadoop/tmp</value>
    </property>
    <property>
        <name>fs.default.name</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>

 

hdfs-site.xml

<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
    <property>
        <name>dfs.namenode.name.dir</name>
        <value>/opt/hadoop/dfs/name</value>
    </property>
    <property>
        <name>dfs.datanode.data.dir</name>
        <value>/opt/hadoop/dfs/data</value>
    </property>
    <property>
        <name>dfs.permissions</name>
        <value>false</value>
    </property>
</configuration>
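Not in the original, but a common stumbling block: the storage directories named in core-site.xml and hdfs-site.xml must exist and be writable by the hadoop user. A precautionary sketch (run as root):

```shell
# Pre-create the tmp, name, and data directories referenced in the
# config files and hand them to the hadoop user, so that
# `hdfs namenode -format` and the datanode can populate them.
mkdir -p /opt/hadoop/tmp /opt/hadoop/dfs/name /opt/hadoop/dfs/data
chown -R hadoop /opt/hadoop/tmp /opt/hadoop/dfs
```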

 

yarn-site.xml

<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
</configuration>

 

mapred-site.xml (create it from mapred-site.xml.template if it does not exist). Note that the Hadoop 1.x property mapred.job.tracker is ignored by Hadoop 2; what matters for running jobs on YARN is mapreduce.framework.name:

<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
</configuration>

 

Configure the environment variables by editing /etc/profile; appending them at the end is fine. You must log out and back in (or reboot) afterwards for them to take effect!

export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.95.x86_64
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar:$JAVA_HOME/bin
export HADOOP_INSTALL=/opt/hadoop
export PATH=${HADOOP_INSTALL}/bin:${HADOOP_INSTALL}/sbin:${PATH}
export HADOOP_MAPRED_HOME=${HADOOP_INSTALL}
export HADOOP_COMMON_HOME=${HADOOP_INSTALL}
export HADOOP_HDFS_HOME=${HADOOP_INSTALL}
export YARN_HOME=${HADOOP_INSTALL}
export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_INSTALL}/lib/native
export HADOOP_OPTS="-Djava.library.path=${HADOOP_INSTALL}/lib:${HADOOP_INSTALL}/lib/native"

 

Now for the moment of truth:

cd /opt/hadoop/

 

Format HDFS:

bin/hdfs namenode -format

 

Start HDFS and YARN:

sbin/start-dfs.sh
sbin/start-yarn.sh
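To confirm the daemons actually came up, `jps` (shipped with the JDK) is a quick check, assuming the JDK's bin directory is on the PATH:

```shell
# List the running JVMs; after start-dfs.sh and start-yarn.sh all five
# Hadoop daemons should appear (PIDs will differ):
#   NameNode, DataNode, SecondaryNameNode, ResourceManager, NodeManager
jps
```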

 

In theory you should see output like:

Starting namenodes on [localhost]
localhost: starting namenode, logging to /usr/opt/hadoop-2.6.0/logs/hadoop-hadoop-namenode-.out
localhost: starting datanode, logging to /usr/opt/hadoop-2.6.0/logs/hadoop-hadoop-datanode-.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /usr/opt/hadoop-2.6.0/logs/hadoop-hadoop-secondarynamenode-.out

Open 127.0.0.1:50070 in a browser; if the Hadoop web UI appears, the installation succeeded.
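Beyond the web UI, a hypothetical smoke test that exercises HDFS itself (run from /opt/hadoop as the hadoop user; the /user/hadoop/input path is an assumption, not from the original):

```shell
# Create a directory in HDFS, upload the config files into it,
# and list them back out; any failure here means HDFS is not healthy.
bin/hdfs dfs -mkdir -p /user/hadoop/input
bin/hdfs dfs -put etc/hadoop/*.xml /user/hadoop/input
bin/hdfs dfs -ls /user/hadoop/input
```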

 


