1. Preface
First you need a working HBase deployment; for a fully distributed setup, see http://www.cnblogs.com/liuchangchun/p/4096891.html, which is similar.
2. HBase configuration
2.1 HUE profile settings: locate the HBase section and configure the following:
# Comma-separated list of HBase Thrift servers for clusters in the format of '(name|host:port)'.
# Use full hostname with security.
# If using Kerberos we assume GSSAPI SASL, not PLAIN.
hbase_clusters=(cluster1|spark-1421-0002:9090)
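For context, a sketch of what the corresponding [hbase] section of hue.ini might look like; only the hbase_clusters line comes from the text above, and the hbase_conf_dir path is an assumption to adjust for your install:

```ini
[hbase]
  # Comma-separated list of HBase Thrift servers for clusters
  # in the format of '(name|host:port)'. Use full hostname with security.
  hbase_clusters=(cluster1|spark-1421-0002:9090)

  # Directory of the HBase configuration files (assumed path; adjust to your install)
  hbase_conf_dir=/etc/hbase/conf
```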
There are already many tutorials on how to configure Hadoop on the Internet. With the instructions on the Hadoop homepage, you can configure a Hadoop cluster across multiple machines. Here I record the problems I encountered during the actual configuration and use of Hadoop.
Reference documents:
http://blog.csdn.net/inkfish/article/details/5168676
http://493663402-qq-com.iteye.com/blog/1515275
http://www.cnblogs.com/syveen/archive/2013/05/08/3068044.html
http://www.cnblogs.com/kinglau/p/3794433.html
Environment: VMware 11, Ubuntu 14.04 LTS, Hadoop 2.7.1
Step 1: Create an account
1. Create the hadoop group and a hadoop user inside it:
[email protected]:~$ sudo adduser --ingroup
My development environment:
Operating system: CentOS 5.5, with one namenode and two datanodes
Hadoop version: hadoop-0.20.203.0
Eclipse version: eclipse-java-helios-SR2-linux-gtk.tar.gz (version 3.7 kept crashing, which was depressing)
Step 1: Start the hadoop daemon first
See http://www.cnblogs.com/flyoung2008/archive/2011/11/29/2268302.html for details
Step 2: Install the
Hadoop 2.7.1 high-availability installation configuration based on QMJ
1. Modify the Host Name and hosts file
10.205.22.185  nn1 (active):  namenode, resourcemanager, datanode, zk, hive, sqoop
10.205.22.186  nn2 (standby): namenode, resourcemanager
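Based on the node plan above, the /etc/hosts entries on every machine might look like the sketch below (only the two listed nodes are shown; any additional worker entries would be assumptions):

```
10.205.22.185  nn1
10.205.22.186  nn2
```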
Hello everyone. Let me introduce how to configure an Eclipse development environment for Hadoop applications on Ubuntu. The purpose is simple: for research and learning, deploy a Hadoop runtime environment and build a Hadoop development and testing environment.
Environment: VMware 8.0 and Ubuntu 11.04
First, set the Java variable and specify the Java directory:
[[email protected] conf]$ pwd
/usr/hadoop/hadoop-1.2.1/conf
[[email protected] conf]$ vim hadoop-env.sh
# JAVA environment setting
export JAVA_HOME=/usr/java/jdk1.7.0_65
The Hadoop configuration files are in /usr/hadoop/hadoop-1.2.1/conf.
Next, configure Hadoop.
1. Decompress the file
Open Cygwin and enter the following commands:
cd .
explorer .
A new window will pop up; put the original Hadoop archive in it and decompress it there. In my opinion it is not strictly necessary to place it in the Cygwin user's root directory, but I have not tried any other location.
2. Configure Hadoop
Open the decompressed folder.
1. Environment: VMware 10, CentOS 6.4 (64-bit), JDK 1.8, Hadoop 2.7 (requires 64-bit Linux).
2. Install the JDK. Hadoop 2.7 needs JDK 7 and also supports JDK 1.8; simply unpack the downloaded JDK and configure the environment variables:
(1) Download jdk-7u79-linux-x64.gz and put it in the /usr/ directory.
(2) Decompress it: tar -zxvf jdk-7u79-linux-x64.gz
(3) Edit /etc/profile (vi /etc/profile):
export JAVA_HOME=/usr/java/jdk1.7.0_79
export CLASSPATH=.:$JAVA_HOME/jre/lib/rt.jar:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export PATH=$PATH:$JAVA_HOME/bin
Statement: For hadoop-related information, please refer to the official documentation and select different versions as needed:
Current version, http://hadoop.apache.org/docs/current/
Version list, http://hadoop.apache.org/docs/
This article uses Hadoop version 0.20.2 and Mac OS X 10.7.5.
Download Hadoop:
Https://archive.apache.org/dist/
1. Overview: The establishment of a cloud platform for colleges and universities started a few days ago. The installation and configuration of the Hadoop cluster test environment took about two days; I have finally completed the basic outline and am sharing my experience here.
2. Hardware environment:
(1) Windows 7 Ultimate 64-bit
(2) VMware Workstation ACE 6.0.2
(3) RedHat Linux 5
Hadoop common configuration items (repost)
core-site.xml

fs.default.name
Value: hdfs://hadoopmaster:9000
Defines the URI and port of the hadoopmaster namenode.

fs.checkpoint.dir
Value: /opt/data/hadoop1/hdfs/namesecondary1
Defines the path of the namenode metadata backup; the official documentation says the secondary namenode reads this and writes dfs.name.dir.
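In the actual core-site.xml these entries take the usual property form; the sketch below uses the values from the table above, and the host and paths should of course be adjusted to your own cluster:

```xml
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://hadoopmaster:9000</value>
  </property>
  <property>
    <name>fs.checkpoint.dir</name>
    <value>/opt/data/hadoop1/hdfs/namesecondary1</value>
  </property>
</configuration>
```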
If a line contains no tab character, the entire line is treated as the key and the value is null. For specific parameter tuning, refer to http://www.uml.org.cn/zjjs/201205303.asp.
Basic usage:
$HADOOP_HOME/bin/hadoop jar $HADOOP_HOME/share/hadoop/tools/lib/hadoop-streaming-2.7.3.jar [options]
Options:
-input: input file path
-output: output file path
-mapper: the mapper executable written by the user
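To make the options concrete, here is a minimal word-count mapper sketch for Hadoop Streaming; the file name wc_mapper.py and the word-count logic are illustrative assumptions, not from the original text:

```python
#!/usr/bin/env python
# Minimal word-count mapper for Hadoop Streaming (illustrative sketch).
# Reads lines from stdin and emits "word<TAB>1" for every word; Streaming
# treats the text before the first tab as the key and the rest as the value.
import sys

def map_line(line):
    """Yield (word, 1) pairs for one input line."""
    for word in line.split():
        yield word, 1

if __name__ == "__main__":
    for line in sys.stdin:
        for word, count in map_line(line):
            sys.stdout.write("%s\t%d\n" % (word, count))
```

It would then be passed to the streaming jar as -mapper wc_mapper.py; each emitted line is split at the first tab into key and value, matching the default behavior described above.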
Today I finally finished setting up the entire Hadoop 2.4 development environment, including connecting Eclipse on Windows 7 to Hadoop; the Eclipse configuration and testing were quite frustrating. First, a picture of the successful result. Hadoop's pseudo-distributed installation and configuration basically causes no problems if you just follow the steps with a little background knowledge.
For the first time, Hadoop was configured in VMs: three virtual machines were created, one as the namenode and jobtracker,
the other two machines as datanodes and tasktrackers.
After configuration, start the cluster.
View the cluster status through http://localhost:50070
No datanode found
Checking the nodes shows that the datanode process has been started, so view the logs on the datanode machine.
Patch with the following commands:
cd hadoop-common-project/hadoop-common/src
wget https://issues.apache.org/jira/secure/attachment/12570212/hadoop-9320.patch
patch < hadoop-9320.patch
6. Compile the source code:
mvn compile -Pnative
7. After compilation succeeds, package it, skipping the test phase (not enough memory to run it):
mvn package -Pnative -Dtar -DskipTests
fs.checkpoint.period
Value: 1800
Defines the interval between namenode metadata checkpoints, in seconds.