MySQL version: mysql-5.1.47 (installed on Red Hat Linux).
First comes the embedded-mode installation of Hive.
The default metastore database for an embedded Hive installation is Derby.
Embedded mode cannot be used for real work,
because in this mode only a single session can access the Derby metastore at a time, so it cannot support concurrent or cluster use.
It can, however, be used to verify that the Hive installation is deployed correctly;
once Hive runs properly in embedded mode,
we can then, based on the configuration files, do a simple deployment of the MySQL metastore configuration.
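For orientation, the later MySQL step boils down to pointing the Hive metastore at MySQL in hive-site.xml. The property names below are the standard Hive metastore settings, but every value (host, database name, user, password) is a placeholder, not something from this article; the sketch writes to a scratch directory rather than the real config path:

```shell
# Sketch only: standard Hive metastore property names, placeholder values.
conf=$(mktemp -d)
cat > "$conf/hive-site.xml" <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hiveuser</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hivepass</value>
  </property>
</configuration>
EOF
grep -c '<property>' "$conf/hive-site.xml"   # prints 4
```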
First create the directories:
mkdir -p /usr/hive     # used to store the files extracted from the Hive tarball
mkdir -p /usr/derby    # used to store the files extracted from the Derby tarball
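The article does not show the unpacking itself. A self-contained demonstration of that step, using a dummy tarball and a scratch prefix in place of /usr (the real archive names and paths are whatever you downloaded, not the placeholders here):

```shell
# Build a tiny stand-in tarball, then extract it the way a Hive/Derby
# release archive would be extracted into its target directory.
work=$(mktemp -d)
mkdir -p "$work/hive-pkg/bin"
echo demo > "$work/hive-pkg/bin/hive"
tar -czf "$work/hive.tar.gz" -C "$work" hive-pkg

mkdir -p "$work/usr-hive"
# --strip-components=1 drops the top-level folder inside the archive,
# so the contents land directly in the target directory.
tar -xzf "$work/hive.tar.gz" -C "$work/usr-hive" --strip-components=1
ls "$work/usr-hive"    # prints: bin
```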
// Register Hive in the system environment variables (e.g. in /etc/profile).
// This assumes HIVE_HOME has already been set to the Hive install directory, e.g.:
export HIVE_HOME=/usr/hive
export PATH=$PATH:$HIVE_HOME/bin
// With this, typing just hive starts the Hive CLI; there is no need to enter the absolute path to the hive binary.
export HIVE_LIB=$HIVE_HOME/lib
Because Hadoop is already installed, exporting the Hadoop paths is not explained again here.
Of course, to make the profile changes take effect immediately:
source /etc/profile
After this, the settings take effect.
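The PATH mechanism described above can be demonstrated in isolation. Here a throwaway directory stands in for $HIVE_HOME and a stub script stands in for the real hive binary (both are illustrative, not the actual installation):

```shell
# Stand-in for the Hive install directory and its bin/hive launcher.
HIVE_HOME=$(mktemp -d)
mkdir -p "$HIVE_HOME/bin"
printf '#!/bin/sh\necho hive-ok\n' > "$HIVE_HOME/bin/hive"
chmod +x "$HIVE_HOME/bin/hive"

# Appending $HIVE_HOME/bin to PATH lets the shell find `hive` by name.
export PATH=$PATH:$HIVE_HOME/bin
hive    # prints hive-ok, found via PATH with no absolute path
```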
-------------------------------------
Next is the provisioning of the Hive configuration files.
First switch to the config directory:
cd /usr/hive/conf/
ls
You will see the file hive-env.sh.template.
(.template files are template files; users can customize and optimize the settings by following their format.)
Copy it and name the copy hive-env.sh; the commands are as follows:
cp hive-env.sh.template hive-env.sh
vi hive-env.sh
Remove the '#' in front of export HADOOP_HEAPSIZE=1024
(of course, you can tune this default of 1024 for your environment).
Remove the '#' in front of export HADOOP_HOME
and point it at the directory where Hadoop is installed (that is, the directory containing Hadoop's conf, lib, bin and similar folders).
(Mine: HADOOP_HOME=/home/hadoop/hadoop)
In fact, the reason Hive needs HADOOP_HOME specified at install time is basically the same as
the reason Hadoop needs JAVA_HOME specified when it is installed:
Hadoop needs Java for support, and Hive needs Hadoop for support.
Also remove the '#' from export HIVE_CONF_DIR=/usr/hive/conf.
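The manual vi edits above can equally be scripted with sed. The sketch below runs against a miniature stand-in for hive-env.sh (the real hive-env.sh.template has more content, and its commented lines may be formatted slightly differently):

```shell
# Miniature stand-in for the commented-out lines in hive-env.sh.
conf=$(mktemp -d)
cat > "$conf/hive-env.sh" <<'EOF'
# export HADOOP_HEAPSIZE=1024
# HADOOP_HOME=${bin}/../../hadoop
# export HIVE_CONF_DIR=
EOF

# Uncomment and fill in the three settings discussed in the article.
sed -i \
  -e 's|^# export HADOOP_HEAPSIZE=1024|export HADOOP_HEAPSIZE=1024|' \
  -e 's|^# HADOOP_HOME=.*|HADOOP_HOME=/home/hadoop/hadoop|' \
  -e 's|^# export HIVE_CONF_DIR=.*|export HIVE_CONF_DIR=/usr/hive/conf|' \
  "$conf/hive-env.sh"
cat "$conf/hive-env.sh"
```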