After the project was completed, we discovered an unpleasant surprise. By default, when Sqoop imports tables from an Oracle database, it maps NUMBER columns whose precision exceeds 15 digits to the double type in Hive. As a result, fields holding more than 16 significant digits lose precision: queries return values accurate to only about 15 digits. A painful lesson worth remembering.
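One way to avoid this is to override Sqoop's default type mapping so that high-precision NUMBER columns are imported as strings instead of doubles. A minimal sketch of such an import command follows; the connection string, credentials, table name, and the column name ID are placeholders, not values from this project:

```shell
# Hypothetical example: force the Oracle NUMBER column "ID" to be imported
# as a Java String (and a Hive STRING) instead of the lossy double default.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/orcl \
  --username scott --password-file /user/scott/.pw \
  --table BIG_NUMBERS \
  --map-column-java ID=String \
  --map-column-hive ID=STRING \
  --hive-import
```

Importing as a string preserves every digit; the value can later be cast to DECIMAL in Hive if arithmetic is needed.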
public class HelloWorld {
    public static void main(String[] args) {
        // Double dou = 9813113054842628; // does not compile: literal exceeds the int range
        String s = "9813113054842628";
        System.out.println(Double.valueOf(s));
        String s1 = "9813113054842627";
        System.out.println(Double.valueOf(s1));
    }
}
Output result
9.813113054842628E15
9.813113054842628E15
The reason is that Oracle's NUMBER type supports more significant digits than Java's double: a double has a 53-bit mantissa and can represent only about 15 to 17 significant decimal digits, so two distinct 16-digit values can collapse to the same double, as the output above shows.
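To keep full precision on the Java side, BigDecimal can be used instead of double. A minimal sketch (the class name PrecisionDemo is my own, not from the original post) comparing the two, using the same pair of 16-digit values:

```java
import java.math.BigDecimal;

public class PrecisionDemo {
    public static void main(String[] args) {
        String a = "9813113054842628";
        String b = "9813113054842627";

        // double collapses both values to the same representation
        // (53-bit mantissa, roughly 15-16 decimal digits).
        System.out.println(Double.valueOf(a).doubleValue()
                == Double.valueOf(b).doubleValue());   // prints "true"

        // BigDecimal stores every digit exactly, so the values stay distinct.
        System.out.println(new BigDecimal(a).equals(new BigDecimal(b)));  // prints "false"
    }
}
```

This is why importing high-precision NUMBER columns as strings (or Hive DECIMAL) rather than double is the safer choice.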