I have just started using Kettle, the desktop ETL tool from Pentaho, mainly for its integration with Hadoop and Hive for data processing. The Kettle version is 4.4, and overall the experience was smooth: I successfully built a transformation that extracts data from Hive into a local file. However, on opening the file, all UTF-8 Chinese characters were garbled. Note also that Kettle only supports Hive 0.7, not 0.8, so it cannot correctly read Hive metadata, but this does not affect normal execution of HQL.
The first step was to look at how Kettle makes its Hive JDBC connection. I initially copied hive-jdbc-0.8.1.jar into {kettlehome}/libext/JDBC, which immediately broke the connection to Hive.
That directory already contains hive-jdbc-0.7.0-pentaho-1.0.2.jar. The driver class in it is an adapter: it does not actually implement the Hive JDBC connection. Instead, it locates the real Hive JDBC classes on the classpath through reflection. The jar that actually calls Hive is {kettlehome}\plugins\pentaho-big-data-plugin\hadoop-deployments\hadoop-20\lib\hive-jdbc-0.7.0-pentaho-1.0.2.jar.
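To make the adapter pattern concrete, here is a minimal sketch of how a shim can find a Hive driver on the classpath via reflection. This is NOT Pentaho's actual code; the class name `HiveDriverShim` is invented for illustration, and `org.apache.hadoop.hive.jdbc.HiveDriver` is the driver class name used by Hive 0.x:

```java
import java.sql.Connection;
import java.sql.Driver;
import java.util.Properties;

public class HiveDriverShim {
    // Driver class name for Hive 0.x; the adapter has no compile-time
    // dependency on it, so the shim jar compiles without any Hive jars.
    private static final String DRIVER_CLASS = "org.apache.hadoop.hive.jdbc.HiveDriver";

    public static Connection connect(String url) throws Exception {
        // Resolve the real driver at runtime; throws ClassNotFoundException
        // if no Hive jar is on the classpath.
        Class<?> cls = Class.forName(DRIVER_CLASS);
        Driver driver = (Driver) cls.getDeclaredConstructor().newInstance();
        return driver.connect(url, new Properties());
    }
}
```

This explains why copying a mismatched hive-jdbc jar into libext/JDBC breaks the connection: the adapter finds whatever Hive classes happen to be on the classpath first.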
Let's look at the implementation inside this jar. You can obtain the sources from http://repo.pentaho.org/artifactory/repo/org/apache/hive/hive-jdbc/0.7.0-pentaho-1.0.2/hive-jdbc-0.7.0-pentaho-1.0.2-sources.jar. Download and unzip it, import the sources into a new Java project of your own, and add the required libraries so that it compiles.
StructObjectInspector soi = (StructObjectInspector) serde.getObjectInspector();
List fieldRefs = soi.getAllStructFieldRefs();
// Object data = serde.deserialize(new BytesWritable(rowStr.getBytes())); // comment out this line
Object data = serde.deserialize(new BytesWritable(rowStr.getBytes("UTF-8"))); // use this line instead
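The root cause is that `String.getBytes()` without an argument encodes with the JVM's default charset (driven by the `file.encoding` system property), so the bytes handed to the SerDe depend on the machine Kettle runs on. A small standalone demonstration of the difference:

```java
import java.io.UnsupportedEncodingException;
import java.nio.charset.StandardCharsets;

public class CharsetDemo {
    public static void main(String[] args) throws UnsupportedEncodingException {
        String s = "中文"; // two Chinese characters
        // UTF-8 always yields 3 bytes per CJK character: 6 bytes total
        byte[] utf8 = s.getBytes(StandardCharsets.UTF_8);
        // Length depends on the platform default charset (e.g. 4 under GBK)
        byte[] platform = s.getBytes();
        System.out.println("UTF-8 bytes:           " + utf8.length);
        System.out.println("default-charset bytes: " + platform.length);
    }
}
```

Forcing `"UTF-8"` in the deserialize call makes the driver's behavior identical on every machine, which is exactly what the one-line patch above does.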
Then add the recompiled class file back into hive-jdbc-0.7.0-pentaho-1.0.2.jar (for example with the JDK's `jar uf` command) and restart Kettle.
Run the transformation again, and the Chinese characters come out correctly. Of course, if your system environment's default encoding is already UTF-8, this problem should not occur in the first place.
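You can check which default encoding your Kettle JVM is actually using with standard JDK calls (nothing here is Kettle-specific):

```java
import java.nio.charset.Charset;

public class EncodingCheck {
    public static void main(String[] args) {
        // The charset String.getBytes() uses when no charset is given
        System.out.println("file.encoding    = " + System.getProperty("file.encoding"));
        System.out.println("defaultCharset() = " + Charset.defaultCharset());
    }
}
```

If this does not report UTF-8, an alternative to patching the driver jar is to add `-Dfile.encoding=UTF-8` to the JVM options in the Spoon launcher script.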
Original article: "Kettle connected to Hive Chinese garbled problem solution". Thanks to the author for sharing.