Hive Summary (vii): Four ways to import data into Hive (strongly recommended)
Several methods of exporting data from Hive: https://www.iteblog.com/archives/955 (strongly recommended)
Import MySQL data into HDFS
1. Manually import using MySQL tools
The simplest way to get MySQL data into HDFS is to export it using command-line tools and MySQL statements, then copy the resulting files into HDFS.
To export the contents of an entire table or an entire database, MySQL provides the mysqldump tool.
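As a minimal sketch (the database name hive, table employees, and output directory /tmp/dump are assumptions for illustration), mysqldump can write delimited data files directly:
$ # --tab writes a .sql schema file and a .txt data file per table into /tmp/dump;
$ # note the directory must be writable by the MySQL server process
$ mysqldump -u hive -p --tab=/tmp/dump --fields-terminated-by=',' hive employees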
For example:
SELECT col1, col2 FROM table_name
INTO OUTFILE '/tmp/out.csv'
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';
This syntax is worth learning; adapt the columns and table name to your own case.
Once we have exported the data to a file, we can use hadoop fs -put to move the file from the local Linux filesystem into HDFS.
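For example, assuming the query above wrote /tmp/out.csv and the target HDFS directory is /user/hive/data (a hypothetical path):
$ hadoop fs -mkdir -p /user/hive/data
$ hadoop fs -put /tmp/out.csv /user/hive/data/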
2. Use Sqoop to import MySQL data into HDFS
$ sqoop import --connect jdbc:mysql://192.168.80.128/hive --username hive \
> --password hive --table employees
Note: substitute your own database name, username, and password here.
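By default Sqoop writes the imported rows as part files under a directory named after the table in your HDFS home directory; a quick sanity check might look like this (paths assumed):
$ hadoop fs -ls employees
$ hadoop fs -cat 'employees/part-m-*' | head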
Import MySQL data into Hive
$ sqoop import --connect jdbc:mysql://192.168.80.128/hive --username hive --password hive --table employees --hive-import --hive-table employees
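Once the import finishes, you can confirm the rows landed in Hive; a minimal check, assuming the table name above:
$ hive -e 'SELECT * FROM employees LIMIT 5;'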
For more detail, see: Import tables and data from MySQL into Hive with Sqoop
Export data from HDFS to MySQL
$ sqoop export --connect jdbc:mysql://192.168.80.128/hive --username hive --password hive --table employees --export-dir edata --input-fields-terminated-by '\t'
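Keep in mind that sqoop export does not create the target table; it must already exist in MySQL with a schema compatible with the exported data. A hypothetical example:
mysql> CREATE TABLE employees (id INT, name VARCHAR(64), salary DECIMAL(10,2));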
For more detail, see: Sqoop 1.4.5 + Hadoop 2.2.0 data transfer from MySQL to HDFS
Export data from Hive to MySQL
$ sqoop export --connect jdbc:mysql://192.168.80.128/hive --username hive --password hive --table employees --export-dir /user/hive/warehouse/employees --input-fields-terminated-by '\001' --input-lines-terminated-by '\n'
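The '\001' here is Hive's default field delimiter (Ctrl-A) for text tables; if you are unsure what delimiter a table uses, you can inspect its storage details first, for example:
hive> DESCRIBE FORMATTED employees;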
Data import and export between HDFS, Hive, and MySQL with Sqoop (strongly recommended)