Ubuntu + Hadoop 2.7 + Hive 1.1.1 + Spark installed successfully: sharing my notes, and if you hit problems, let's discuss them together


The installation tutorials I've seen on the internet are honestly a bit sad. I failed with many of them.

So I'm sharing the blogs that worked in my own experiments, for reference:

Recommendation 1, highly rated: http://www.powerxing.com/install-hadoop/ For Hadoop + Spark I followed this blog from start to finish. Excellent; recommendation index 5 stars.

For Hive, I referred to too many articles to list here. None of them worked, and I don't know whether the mistake was mine or what happened.

.....

Finally, almost by accident, I came across the book Programming Hive, and the tutorial inside got the installation to succeed.


root@ubuntu:/usr/local/hadoop/hive# hive

hive>


root@ubuntu:/usr/local/hadoop/hive# jps
8100 ResourceManager
8533 JobHistoryServer
7709 SecondaryNameNode
18406 Jps
7514 DataNode
7410 NameNode
8204 NodeManager
root@ubuntu:/usr/local/hadoop/hive#


root@ubuntu:/usr/local/spark# ./bin/spark-shell

scala>
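
With the scala> prompt up, a one-line sanity check that the shell actually computes is worth running (my own habit, not from the original post; any small job will do):

scala> sc.parallelize(1 to 100).reduce(_ + _)
res0: Int = 5050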


There are already too many documents out there, so instead I'll share the pits I fell into on my first install:

The big one is environment variables. I stumbled over these a lot; for example, a path that was clearly meant to be /usr/local got typed as /usr/lacol.

When you then start the programs, you get inexplicable, strange errors. If you see errors like that, check your environment variables first; this trap is easy to fall into.
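
For reference, here is a minimal sketch of the variables involved, assuming the install paths used throughout this post (append to ~/.bashrc, run source ~/.bashrc, and double-check every path character by character):

export HADOOP_HOME=/usr/local/hadoop
export HIVE_HOME=/usr/local/hadoop/hive
export SPARK_HOME=/usr/local/spark
# JAVA_HOME below is an assumption; point it at wherever your JDK actually lives
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$HIVE_HOME/bin:$SPARK_HOME/bin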


Many configuration files posted online contain values, such as the hostname, that must be changed to your own. Don't paste them in verbatim without changing the parameters.
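
For example, Hadoop's core-site.xml embeds an address that must match your own machine. A typical single-node entry looks like this (the hostname and port here are placeholders, not taken from the original post):

<property>
  <name>fs.defaultFS</name>
  <value>hdfs://localhost:9000</value> <!-- replace localhost with your own hostname -->
</property>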


Here are the download links for the packages I used:

wget http://www.eu.apache.org/dist/hive/hive-1.1.1/apache-hive-1.1.1-bin.tar.gz (Hive)

wget http://dev.mysql.com/get/Downloads/Connector-J/mysql-connector-java-5.1.39.tar.gz (the JDBC driver Hive depends on for its MySQL connection)

wget http://archive.apache.org/dist/spark/spark-1.6.0/spark-1.6.0-bin-without-hadoop.tgz (Spark)

wget http://mirrors.cnnic.cn/apache/hadoop/common/hadoop-2.7.1/hadoop-2.7.1.tar.gz (Hadoop)
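
Unpacking these into the directories the prompts above assume looks roughly like this (my own sketch; the top-level directory names are the usual upstream ones):

tar -zxvf hadoop-2.7.1.tar.gz -C /usr/local && mv /usr/local/hadoop-2.7.1 /usr/local/hadoop
tar -zxvf apache-hive-1.1.1-bin.tar.gz -C /usr/local/hadoop && mv /usr/local/hadoop/apache-hive-1.1.1-bin /usr/local/hadoop/hive
tar -zxvf spark-1.6.0-bin-without-hadoop.tgz -C /usr/local && mv /usr/local/spark-1.6.0-bin-without-hadoop /usr/local/spark

One extra step for this particular Spark build: a "without-hadoop" package has to be told where Hadoop's jars are, so add this line to /usr/local/spark/conf/spark-env.sh:

export SPARK_DIST_CLASSPATH=$(/usr/local/hadoop/bin/hadoop classpath)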


Sharing Hive in local (embedded) mode. Tested, no problems; the Hive package is version 1.1.1.

The work is mainly one configuration file, and local mode needs especially few changes. You only need to configure the contents below; the goal is to stop the metadata store from being created in a different directory each time the hive command is executed (by default, embedded Derby drops a metastore_db directory into whatever directory you happen to run hive from).


root@ubuntu:/usr/local/hadoop/hive/conf# cat hive-site.xml

<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?><!--
   Licensed to the Apache Software Foundation (ASF) under one or more
   contributor license agreements. See the NOTICE file distributed with
   this work for additional information regarding copyright ownership.
   The ASF licenses this file to You under the Apache License, Version 2.0
   (the "License"); you may not use this file except in compliance with
   the License. You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.
-->

<configuration>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/usr/local/hadoop/hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <!-- Embedded Derby: create=true means the database is created automatically, with the database name metastore_db -->
    <value>jdbc:derby:;databaseName=/usr/local/hadoop/hive/metastore_db;create=true</value>
    <!-- Derby in client/server mode: hadoopor is the database name, 192.168.0.3 is the Derby server's IP address, and 4567 is the server's port -->
    <!--<value>jdbc:derby://192.168.0.3:4567/hadoopor;create=true</value>-->
    <description>JDBC connect string for a JDBC metastore</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>org.apache.derby.jdbc.EmbeddedDriver</value>
    <!--<value>org.apache.derby.jdbc.ClientDriver</value>-->
    <description>Driver class name for a JDBC metastore</description>
  </property>
</configuration>

root@ubuntu:/usr/local/hadoop/hive/conf#
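
A quick smoke test of the embedded setup (my own check, assuming hive is on the PATH): run a trivial statement, then confirm the metastore landed where the connection URL pins it, no matter which directory you started from:

hive -e "show databases;"
ls /usr/local/hadoop/hive/metastore_db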


Using JDBC (MySQL) to manage metadata

This needs a server running MySQL; I prepared a separate machine for testing.

mysql -uroot -pmysql
mysql> CREATE USER 'hive' IDENTIFIED BY 'hive';
mysql> GRANT ALL PRIVILEGES ON *.* TO 'hive'@'%' WITH GRANT OPTION;
mysql> FLUSH PRIVILEGES;


Using JDBC to manage metadata requires a JDBC driver; the download link provided earlier can be used.
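
If you grabbed the tarball from the link above, it needs unpacking first (a step the original commands skip over):

tar -zxvf mysql-connector-java-5.1.39.tar.gz

Then move the jar into Hive's lib directory: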

mv mysql-connector-java-5.1.39/mysql-connector-java-5.1.39-bin.jar /usr/local/hadoop/hive/lib/


Back up the hive-site.xml above, then rewrite the file as follows:

<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?><!--
   Licensed to the Apache Software Foundation (ASF) under one or more
   contributor license agreements. See the NOTICE file distributed with
   this work for additional information regarding copyright ownership.
   The ASF licenses this file to You under the Apache License, Version 2.0
   (the "License"); you may not use this file except in compliance with
   the License. You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.
-->

<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://120.27.7.76/hive?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive</value>
  </property>
</configuration>

root@ubuntu:/usr/local/hadoop/hive/conf#


root@ubuntu:~# hive

hive>
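
To confirm Hive is really writing its metadata to MySQL, one check I find useful (my own addition; TBLS and TBL_NAME are from the standard metastore schema): create a table in Hive, then look for it on the MySQL side:

hive> CREATE TABLE test_tbl (id INT);

mysql -uhive -phive
mysql> USE hive;
mysql> SELECT TBL_NAME FROM TBLS;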


