Install Hadoop + Hive on Ubuntu

Source: Internet
Author: User
Tags: hadoop fs
Ubuntu: build and install a VM for the Hadoop environment

Download: get VMware-player-5.0.1-894247.zip from the official website.

Install and configure Ubuntu

Download: get ubuntu-12.10-desktop-i386.iso from the official website.

Open the VM, load the Ubuntu ISO file, then install the system and apply updates.

Log in to Ubuntu. On the first login you need to set the root password:

> sudo passwd root

Create a user:

# sudo useradd username
# sudo passwd username

 

Simple Vim Configuration

syntax on
set ruler
set autoindent
set cindent
set hlsearch
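These settings go in the per-user Vim configuration file. A minimal sketch, assuming you simply append them to ~/.vimrc for the current user:

cat >> ~/.vimrc <<'EOF'
syntax on
set ruler
set autoindent
set cindent
set hlsearch
EOF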

 

Install and configure JDK

First, extract the *.tar.gz archive:

tar -xzvf *.tar.gz

Assume the extracted folder is named java. Move it to /usr/:

sudo mv java /usr/

Then set the environment variables:

sudo gedit /etc/profile

This opens the file. Before the umask 022 line near the end, add:

export JAVA_HOME=/usr/java
export JRE_HOME=/usr/java/jre
export CLASSPATH=.:$JAVA_HOME/lib:$JRE_HOME/lib:$CLASSPATH
export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$PATH
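To make the new variables visible in the current shell without logging out again, the profile can be re-read (a standard step, not mentioned in the original):

source /etc/profile
echo $JAVA_HOME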

Modify the default JDK in Ubuntu

update-alternatives --install /usr/bin/java java /usr/java/bin/java 300
update-alternatives --install /usr/bin/javac javac /usr/java/bin/javac 300

This step registers the installed JDK as a Java alternative.

update-alternatives --config java

Select the default JDK

java -version

Check that the expected version is reported.

Install and configure passwordless SSH

Install: sudo apt-get install ssh

Check the IP address: ifconfig

root@ubuntu:~# ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
root@ubuntu:~# cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
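To verify that key-based login works, connect to the local machine; it should not ask for a password (a quick check, not in the original text):

ssh localhost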

 

./hadoop namenode -format

sh start-all.sh

View the processes: ps -ef | grep hadoop

sh stop-all.sh
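The JDK's jps tool is another way to check which Hadoop daemons are running; on a healthy single-node setup you would expect to see NameNode, DataNode, SecondaryNameNode, JobTracker and TaskTracker (this check is not part of the original text):

jps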

 

Install and configure Hadoop

Download: get hadoop-1.0.4.tar.gz from the official website.

Extract the package to a chosen path; here we use /root/.

Configure Hadoop: the XML configuration files are under /root/hadoop-1.0.4/conf.

 

Create a datas directory under the Hadoop directory, create hdfs and tmp directories inside datas, and create name and data directories inside hdfs, as sketched below.
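A minimal sketch of creating these directories, assuming Hadoop was extracted to /root/hadoop-1.0.4 as above:

mkdir -p /root/hadoop-1.0.4/datas/tmp
mkdir -p /root/hadoop-1.0.4/datas/hdfs/name
mkdir -p /root/hadoop-1.0.4/datas/hdfs/data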

Configure core-site.xml:

<property>
  <name>fs.default.name</name>
  <value>hdfs://192.168.119.128:8898</value>
</property>

<property>
  <name>hadoop.tmp.dir</name>
  <value>/root/hadoop-1.0.4/datas/tmp</value>
</property>
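These property blocks belong inside the <configuration> element of the file; for reference, a minimal core-site.xml using the values above would look like this:

<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://192.168.119.128:8898</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/root/hadoop-1.0.4/datas/tmp</value>
  </property>
</configuration>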

 

hdfs-site.xml

<property>
  <name>dfs.name.dir</name>
  <value>/root/hadoop-1.0.4/datas/hdfs/name/</value>
</property>

<property>
  <name>dfs.data.dir</name>
  <value>/root/hadoop-1.0.4/datas/hdfs/data/</value>
</property>

 

mapred-site.xml

<property>
  <name>mapred.job.tracker</name>
  <value>192.168.119.128:8899</value>
</property>

 

Also edit the masters and slaves files in the conf directory (a single-node example is sketched below).
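A minimal sketch for a single-node setup, assuming everything runs on this one machine (the IP address used above would also work):

conf/masters:
localhost

conf/slaves:
localhost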

1. If it does not run properly and the error says JAVA_HOME cannot be found, edit conf/hadoop-env.sh to specify the JAVA_HOME path (see the sketch after this list).

2. If it still does not run, or port 50030 or 50070 cannot be accessed, the namenode did not start properly. Format it again, then restart and retry: hadoop namenode -format (answer Y).
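A minimal sketch of the line to set in conf/hadoop-env.sh, assuming the JDK was moved to /usr/java as described earlier:

export JAVA_HOME=/usr/java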

 

Problems:

1. The "Browse the filesystem" link on the Hadoop status page does not work, or the page opened from the Windows host cannot resolve localhost.

Configure the host address mapping. On the Windows side, edit the hosts file under C:\windows\system32\drivers\etc:

10.15.82.48 ubuntu
10.15.82.48 vm.hfx.localhost

 

Address configuration on the Hadoop server side (edit /etc/hosts):

root@ubuntu:/etc# vi hosts

#10.15.82.48 localhost
10.15.82.48 10.15.82.48.localhost
127.0.1.1 ubuntu

 

2. Warning: $HADOOP_HOME is deprecated.

Inspecting the hadoop-1.0.3/bin/hadoop script and the hadoop-config.sh script shows that they check whether the HADOOP_HOME environment variable is set; in this environment there is no need to set it.
Solution 1: edit the .bash_profile file in the home directory (.bashrc on Ubuntu), remove the HADOOP_HOME variable setting, and run a hadoop fs command again. The warning disappears.
Solution 2: edit the .bash_profile file in the home directory and add an environment variable; the warning disappears:

export HADOOP_HOME_WARN_SUPPRESS=1

My case: HADOOP_HOME was set in the system environment variables, and export HADOOP_HOME=E:/hadoop/hadoop-1.0.3 was also set in hadoop-env.sh. After commenting out that line in the file, the warning no longer appeared.

 

Install and configure MySQL

First check whether MySQL is already installed (the rpm tool can be installed with sudo apt-get install rpm if it is missing):

rpm -qa | grep mysql

If any MySQL packages are listed, MySQL is already installed.

Install: sudo apt-get install mysql-server

root@ubuntu:/usr/bin# ./mysql -u root -p

Enter the root password (111111 in this example).

mysql> show databases;

mysql> use test;

Database changed

mysql> show tables;

mysql> create table Table1 (a int, b int);

mysql> show tables;

+----------------+
| Tables_in_test |
+----------------+
| Table1         |
+----------------+
1 row in set (0.00 sec)

 

mysql> insert into Table1 (a, b) values (1, 1);

Query OK, 1 row affected (0.01 sec)

mysql> insert into Table1 (a, b) values (1, 2);

Query OK, 1 row affected (0.00 sec)

mysql> insert into Table1 (a, b) values (1, 3);

Query OK, 1 row affected (0.00 sec)

mysql> select * from Table1;

+------+------+
| a    | b    |
+------+------+
|    1 |    1 |
|    1 |    2 |
|    1 |    3 |
+------+------+
3 rows in set (0.02 sec)
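The Hive configuration below connects to MySQL as user hadoop and stores its metastore in a hive database (created automatically via createDatabaseIfNotExist=true). A minimal sketch of preparing that account; the user name and password hadoop/hadoop mirror the values used below and should be adjusted to your own:

mysql> create user 'hadoop'@'%' identified by 'hadoop';
mysql> grant all privileges on hive.* to 'hadoop'@'%';
mysql> flush privileges;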

 

Install and configure Hive

Download: hive-0.9.0.tar.gz

Extract the package to a path of your choice.

First, copy the mysql-connector-java-5.1.22-bin.jar to the lib directory under the Hive installation path.
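A minimal sketch, assuming Hive was extracted to /root/hive-0.9.0 (a hypothetical path; adjust to your layout):

cp mysql-connector-java-5.1.22-bin.jar /root/hive-0.9.0/lib/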

Then configure hive-site.xml; this configuration can also be found on the official website.

<property>
  <name>hive.metastore.local</name>
  <value>true</value>
</property>

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://vm.hfx.localhost:3306/hive?createDatabaseIfNotExist=true</value>
</property>

<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>

<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hadoop</value> <!-- the MySQL user name -->
</property>

<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hadoop</value> <!-- the MySQL password -->
</property>
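To verify the setup, start the Hive CLI from the Hive installation directory and run a simple statement; if the MySQL metastore connection is configured correctly, it returns without errors (a quick check, not part of the original text):

bin/hive
hive> show tables;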
