Install Hive (standalone mode with MySQL connection)

Source: Internet
Author: User
Tags: chmod, hadoop fs, log4j

1. Prerequisites: Java and Hadoop 2 are already installed.

2. Download the installation package.

3. Unpack the installation package:
   tar zxvf apache-hive-1.2.1-bin.tar.gz

4. Install MySQL (run as root; you may also need to configure the yum source first):
   yum -y install mysql-server mysql mysql-devel
   Common MySQL commands:
   service mysqld start          # or: service mysqld stop
   chkconfig mysqld on           # start with the system at boot (run as the system root user)

5. Authorize MySQL (log in to the database as the MySQL root user):
   mysqladmin -u root password "root"      # change the root password to root
   mysql -uroot -p                         # the initial password is empty
   create user 'hive' identified by 'hive';   -- create the hive user (password hive) for the connection
   grant all privileges on *.* to 'hive'@'%' identified by 'hive' with grant option;
   flush privileges;                          -- refresh permissions
   grant all privileges on *.* to 'hive'@'localhost' identified by 'hive' with grant option;
   flush privileges;
   grant all privileges on *.* to 'hive'@'hadoop.master' identified by 'hive' with grant option;
   flush privileges;
   set global binlog_format='MIXED';          -- must be executed, otherwise an error occurs
   exit;
   service mysqld restart                     # restart the service

6. Test the connection:
   mysql -hhadoop.master -uhive -phive        # if you can log in, the setup is successful
   create database hive;                      -- create the metastore database
   alter database hive character set latin1;

7. Configure environment variables (/etc/profile):
   #hive
   export HIVE_HOME=/opt/hive-1.2.1
   export HIVE_AUX_JARS_PATH=/opt/hive-1.2.1/lib
   export HIVE_CONF_DIR=/opt/hive-1.2.1/conf
   export PATH=$PATH:$HIVE_HOME/bin
   export CLASSPATH=$CLASSPATH:$HIVE_HOME/lib
   Save, exit, and then run: source /etc/profile

8. Modify the configuration files
   1. Copy the configuration files from their templates:
      cp hive-default.xml.template hive-site.xml
      cp hive-env.sh.template hive-env.sh
      cp hive-log4j.properties.template hive-log4j.properties
   2. Modify the configuration files:

   ####hive-site.xml####
   Add this property (you do not have to add it in version 0.11):
   <property>
     <name>hive.metastore.local</name>
     <value>false</value>
   </property>
   Modify these properties:
   <property>
     <name>javax.jdo.option.ConnectionURL</name>
     <value>jdbc:mysql://hadoop.master:3306/hive</value>
     <description>JDBC connect string for a JDBC metastore</description>
   </property>
   <property>
     <name>javax.jdo.option.ConnectionDriverName</name>
     <value>com.mysql.jdbc.Driver</value>
     <description>Driver class name for a JDBC metastore</description>
   </property>
   <property>
     <name>javax.jdo.option.ConnectionUserName</name>
     <value>hive</value>
     <description>Username to use against metastore database</description>
   </property>
   <property>
     <name>javax.jdo.option.ConnectionPassword</name>
     <value>hive</value>
     <description>Password to use against metastore database</description>
   </property>
   <property>
     <name>hive.exec.local.scratchdir</name>
     <value>/opt/hive-1.2.1/tmp</value>   <!-- this directory must be created -->
     <description>Local scratch space for Hive jobs</description>
   </property>
   <property>
     <name>hive.downloaded.resources.dir</name>
     <value>/opt/hive-1.2.1/tmp</value>
     <description>Temporary local directory for added resources in the remote file system.</description>
   </property>
   <property>
     <name>hive.hwi.war.file</name>
     <value>/opt/hive-1.2.1/lib/hive-hwi-1.2.1.jar</value>
     <description>This sets the path to the HWI war file, relative to ${HIVE_HOME}.</description>
   </property>

   ####hive-env.sh####
   HADOOP_HOME=/opt/hadoop-2.5.2

   ####hive-log4j.properties####
   hive.log.threshold=ALL
   hive.root.logger=INFO,DRFA
   # the log directory below must be created
   hive.log.dir=/opt/hive-1.2.1/logs
   hive.log.file=hive.log
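   The notes above say the local scratch and log directories must be created by hand; a minimal sketch of doing that, using the paths from the configuration values shown here (adjust them to your own layout):

   mkdir -p /opt/hive-1.2.1/tmp     # hive.exec.local.scratchdir and hive.downloaded.resources.dir
   mkdir -p /opt/hive-1.2.1/logs    # hive.log.dir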
9. Other configuration items
   1. Create the required folders on HDFS and modify their permissions:
      hadoop fs -mkdir -p /tmp/hive
      hadoop fs -chmod 777 /tmp/hive
      hadoop fs -mkdir -p /user/hive
      hadoop fs -chmod 777 /user/hive
   2. Modify Hadoop's hadoop-env.sh configuration file:
      export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$CLASSPATH
   3. Copy the MySQL JDBC jar into Hive's lib directory:
      cp mysql-connector-java-5.1.21.jar /opt/hive-1.2.1/lib
   4. Copy jline-2.12.jar from Hive's lib directory to /opt/hadoop-2.5.2/share/hadoop/yarn/lib and rename the old package there:
      cp /opt/hive-1.2.1/lib/jline-2.12.jar /opt/hadoop-2.5.2/share/hadoop/yarn/lib
      mv /opt/hadoop-2.5.2/share/hadoop/yarn/lib/jline-0.9.94.jar /opt/hadoop-2.5.2/share/hadoop/yarn/lib/jline-0.9.94.jar.bak

10. Verify the installation:
    hive --service metastore &       # start the metastore service first on the first run
    hive -e "show databases;"        # if this runs without an error, the installation succeeded
    You can then continue working with the hive command.

11. Common commands
    1. Show:
       show tables;
       show databases;
    2. Define a table (an external partitioned table; this kind of table is recommended):
       create external table access_info (ip string, access_date string, url string)
       partitioned by (logdate string)
       row format delimited fields terminated by '\t';
       desc access_info;
    3. Add data:
       alter table access_info add partition (logdate='2016-01-15') location '/access';
       -- loads from an actual HDFS path; access is a folder name
       load data local inpath '/home/hadoop/huangzhijian/access.txt' into table access_info_local_file;
       -- loads a local file
    4. Query:
       select * from access_info;
    5. Delete:
       drop table access_info;
       -- dropping an external table does not delete the underlying data; dropping an internal table deletes the raw data
       -- note that Hive tables cannot be updated
    6. Other:
       1. hive -f test.sql
          ####test.sql####
          select * from t1;
          select count(*) from t1;
       2. hive -e 'hql statement'
       3. hive -S -e 'select * from t1'         (same usage as the first form, but in silent mode: the MapReduce progress is not shown)
       4. hive -e 'select * from t1' > test.txt (write the results to a local file)
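    As a small worked example, assuming the access_info table and the 2016-01-15 partition created above, a query can be restricted to a single partition by filtering on the partition column, which keeps Hive from scanning every partition:

       select ip, url from access_info where logdate='2016-01-15' limit 10;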

  
