Hue 3.7.0 Installation and Configuration

1. Install dependencies
yum install rsync gcc openldap-devel python-ldap mysql-devel python-devel python-setuptools python-simplejson sqlite-devel libxml2-devel libxslt-devel cyrus-sasl-devel
2. Download the CDH version of Hue
wget http://archive-primary.cloudera.com/cdh5/cdh/5/hue-3.7.0-cdh5.4.2.tar.gz
3. Extract and install
tar zxvf hue-3.7.0-cdh5.4.2.tar.gz
cd hue-3.7.0-cdh5.4.2
make install PREFIX=/hue HADOOP_HOME=/home/hadoop/hadoop
ln -s /hue/hue/desktop/libs/hadoop/java-lib/hue-plugins-3.7.0-cdh5.4.2.jar /home/hadoop/hadoop/lib
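If the build succeeded, the PREFIX=/hue install should have produced a Python virtualenv under /hue/hue/build/env, and the ln -s above should have placed the plugin jar in Hadoop's lib directory. A quick check (paths assume the PREFIX and HADOOP_HOME used above):
ls /hue/hue/build/env/bin/hue
ls -l /home/hadoop/hadoop/lib/hue-plugins-3.7.0-cdh5.4.2.jar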
4. Create the MySQL database for Hue
mysql -u root -p
create database hue;
grant all on hue.* to 'hue'@'localhost' identified by 'Ab1234567890';
grant all on hue.* to 'hue'@'HD1' identified by 'Ab1234567890';
grant all on hue.* to 'hue'@'%' identified by 'Ab1234567890';
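Hue stores its metadata in SQLite by default, so for the commands in step 5 to write to this MySQL database, the hue.ini edited in step 6 also needs a [[database]] block under [desktop]. A minimal sketch, assuming the database and credentials created above:
[desktop]
  [[database]]
    engine=mysql
    host=hd1
    port=3306
    user=hue
    password=Ab1234567890
    name=hue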
5. Initialize the Hue database
/hue/hue/build/env/bin/hue migrate
/hue/hue/build/env/bin/hue syncdb
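To confirm the initialization actually wrote to MySQL (assuming the [[database]] block from step 4 is in place before running these commands), list the tables Hue created:
mysql -u hue -p -e "show tables;" hue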
6. Edit the configuration file
vim /hue/hue/desktop/conf/hue.ini
[desktop]
secret_key=xxxx11112222
http_host=hd1          # an IP address can also be used
http_port=7777
time_zone=Asia/Shanghai
server_user=hadoop
server_group=hadoop
default_user=hadoop
default_hdfs_superuser=hadoop
default_site_encoding=utf-8
[hadoop]
  [[hdfs_clusters]]
    [[[default]]]
      fs_defaultfs=hdfs://hd1:9000
      webhdfs_url=http://hd1:14000/webhdfs/v1
      hadoop_conf_dir=/home/hadoop/hadoop/etc/hadoop
[beeswax]
hive_server_host=hd1
hive_server_port=10000
hive_conf_dir=/usr/local/spark/spark-1.3.0-bin-hadoop2.3/conf
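The webhdfs_url above points at an HttpFS service on port 14000 rather than the NameNode's built-in WebHDFS port. A quick sanity check that the endpoint responds, assuming HttpFS is running on hd1 and a hadoop user exists in HDFS:
curl "http://hd1:14000/webhdfs/v1/?op=LISTSTATUS&user.name=hadoop"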
Other features can be left unconfigured if they are not used.
Beeswax is the Hive app. If you want to use Spark SQL, first make sure the Spark Thrift Server is running properly; Hue will then send the HiveQL submitted through Beeswax directly to the Spark SQL Thrift Server.
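A sketch of bringing up the Spark Thrift Server from the Spark home referenced in hive_conf_dir, assuming the hd1 host and port 10000 configured in [beeswax]:
/usr/local/spark/spark-1.3.0-bin-hadoop2.3/sbin/start-thriftserver.sh \
  --hiveconf hive.server2.thrift.bind.host=hd1 \
  --hiveconf hive.server2.thrift.port=10000
Once it is reachable, Hue itself can be started with the supervisor script from the build, /hue/hue/build/env/bin/supervisor, after which the web UI should answer at http://hd1:7777.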