1. Missing MySQL driver package
1.1 Problem Description
Caused by: org.datanucleus.store.rdbms.connectionpool.DatastoreDriverNotFoundException: The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.
    at org.datanucleus.store.rdbms.connectionpool.AbstractConnectionPoolFactory.loadDriver(AbstractConnectionPoolFactory.java:58)
    at org.datanucleus.store.rdbms.connectionpool.BoneCPConnectionPoolFactory.createConnectionPool(BoneCPConnectionPoolFactory.java:54)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:213)
1.2. Solution
The error above means the MySQL JDBC driver jar is missing from the classpath. Download mysql-connector-java-5.1.32.tar.gz, extract it, and copy the jar into Hive's lib directory:
xiaosi@yoona:~$ cp mysql-connector-java-5.1.34-bin.jar opt/hive-2.1.0/lib/
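If you are unsure whether the connector jar actually made it onto the classpath, one quick check is to scan the jars under Hive's lib directory for the driver class (jar files are ordinary zip archives). This is only a sketch; the lib path is an assumption taken from the command above, not anything Hive provides:

```python
# Sketch: scan a Hive lib directory for the MySQL JDBC driver class.
# The lib path in __main__ is an assumption from this article's setup.
import glob
import os
import zipfile

def find_driver(lib_dir, class_entry="com/mysql/jdbc/Driver.class"):
    """Return the jars under lib_dir that contain class_entry."""
    hits = []
    for jar in sorted(glob.glob(os.path.join(lib_dir, "*.jar"))):
        try:
            with zipfile.ZipFile(jar) as zf:
                if class_entry in zf.namelist():
                    hits.append(jar)
        except zipfile.BadZipFile:
            pass  # skip corrupt archives
    return hits

if __name__ == "__main__":
    jars = find_driver(os.path.expanduser("~/opt/hive-2.1.0/lib"))
    print(jars if jars else "MySQL driver not found - copy the connector jar into lib/")
```

If the scan prints an empty result, the DatastoreDriverNotFoundException above will keep appearing until the jar is copied in.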
2. MySQL metastore initialization
2.1 Problem Description
Running the ./hive script fails to start with this error:
Exception in thread "main" java.lang.RuntimeException: Hive metastore database is not initialized. Please use schematool (e.g. ./schematool -initSchema -dbType ...) to create the schema. If needed, don't forget to include the option to auto-create the underlying database in your JDBC connection string (e.g. ?createDatabaseIfNotExist=true for MySQL)
2.2 Solution
Run schematool -initSchema -dbType mysql in the scripts directory to initialize the Hive metastore database:
xiaosi@yoona:~/opt/hive-2.1.0/scripts$ schematool -initSchema -dbType mysql
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/xiaosi/opt/hive-2.1.0/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/xiaosi/opt/hadoop-2.7.3/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Metastore connection URL:     jdbc:mysql://localhost:3306/hive_meta?createDatabaseIfNotExist=true
Metastore Connection Driver:  com.mysql.jdbc.Driver
Metastore connection User:    root
Starting metastore schema initialization to 2.1.0
Initialization script hive-schema-2.1.0.mysql.sql
Initialization script completed
schemaTool completed
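The error message in 2.1 also suggests adding createDatabaseIfNotExist=true to the JDBC connection string, and the schematool output above shows it in the metastore URL. As a sketch of what that option amounts to, here is plain string handling (not a Hive or JDBC API) that appends it to a URL when missing:

```python
# Sketch: ensure a JDBC URL carries the createDatabaseIfNotExist option,
# as the error message in section 2.1 suggests. Plain string handling only.
def with_auto_create(jdbc_url, option="createDatabaseIfNotExist=true"):
    if option in jdbc_url:
        return jdbc_url  # already present, leave untouched
    sep = "&" if "?" in jdbc_url else "?"  # start or extend the query string
    return jdbc_url + sep + option

url = with_auto_create("jdbc:mysql://localhost:3306/hive_meta")
print(url)  # jdbc:mysql://localhost:3306/hive_meta?createDatabaseIfNotExist=true
```

In practice you would put the resulting URL into the javax.jdo.option.ConnectionURL property of hive-site.xml so MySQL creates the hive_meta database on first use.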
3. Relative path in absolute URI
3.1 Problem Description
Exception in thread "main" java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/${system:user.name%7D ...
Caused by: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/${system:user.name%7D
    at java.net.URI.checkPath(URI.java:1823)
    at java.net.URI.<init>(URI.java:745)
    at org.apache.hadoop.fs.Path.initialize(Path.java:202)
    ... more
3.2 Solution
The problem is caused by referencing variables that were never defined. To resolve it, define the two variables system:user.name and system:java.io.tmpdir in the configuration file hive-site.xml:
<property>
  <name>system:user.name</name>
  <value>xiaosi</value>
</property>
<property>
  <name>system:java.io.tmpdir</name>
  <value>/home/${system:user.name}/tmp/hive/</value>
</property>
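A quick way to confirm the fix is to check that hive-site.xml actually defines both variables before restarting Hive. A minimal sketch using only the standard-library XML parser (the helper names are hypothetical, not part of Hive):

```python
# Sketch: verify that a hive-site.xml defines the two system: variables
# the URI error in 3.1 complains about. Helper names are hypothetical.
import xml.etree.ElementTree as ET

def read_properties(hive_site_path):
    """Return {name: value} for every <property> in a Hadoop-style XML file."""
    props = {}
    for prop in ET.parse(hive_site_path).getroot().iter("property"):
        name = prop.findtext("name")
        if name is not None:
            props[name] = prop.findtext("value")
    return props

def missing_system_vars(props):
    """List the required system: variables that are not defined."""
    required = ("system:user.name", "system:java.io.tmpdir")
    return [name for name in required if name not in props]
```

Running read_properties on your hive-site.xml and printing missing_system_vars of the result should give an empty list once the two properties above are in place.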
4. Connection refused
4.1 Problem Description
An exception: java.net.ConnectException: Connection refused;
For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
...
Caused by: java.net.ConnectException: Call From qunar/127.0.0.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused;
For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
...
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
    at org.apache.hadoop.net. ...
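Connection refused on localhost:9000 usually means nothing is listening there, typically because the HDFS NameNode is not running or fs.defaultFS points at a different port. Before digging into Hive itself, it can help to confirm whether the port is open at all; a minimal sketch, with host and port taken as assumptions from the stack trace above:

```python
# Sketch: test whether anything is listening on host:port before blaming Hive.
# localhost:9000 is the address from the stack trace above.
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers ConnectionRefusedError, timeouts, bad hostnames
        return False

if __name__ == "__main__":
    if not port_open("localhost", 9000):
        print("nothing listening on localhost:9000 - is HDFS started (start-dfs.sh)?")
```

If the port is closed, starting the Hadoop daemons (and re-checking with jps that the NameNode is up) is the first thing to try; the Hadoop wiki page linked in the exception covers the remaining causes.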