Sqoop Installation and Use (Experimental)


Sqoop is used to import and export data:

(1) Import data from databases such as MySQL and Oracle into HDFS, Hive, or HBase.
(2) Export data from HDFS, Hive, or HBase into MySQL, Oracle, and other databases.
(3) Imports and exports run as map-only jobs; each mapper task commits its own transaction.

1. Sqoop installation steps

1.1 Unpack the tarball:

tar -zxvf sqoop-1.4.3.bin__hadoop-1.0.0.tar.gz

1.2 Rename the directory to a simpler path:

mv sqoop-1.4.3.bin__hadoop-1.0.0 sqoop
1.3 Configure the environment variables in /etc/profile, then apply them with: source /etc/profile

export SQOOP_HOME=/home/hadoop/app/sqoop
export PATH=.:$SQOOP_HOME/bin:$HADOOP_HOME/bin:$ZOOKEEPER_HOME/bin:$HBASE_HOME/bin:$HIVE_HOME/bin:$JAVA_HOME/bin:$PATH

1.4 Copy the MySQL JDBC driver into Sqoop's lib directory:

cp /home/hadoop/app/mysql-connector-java-5.1.10.jar $SQOOP_HOME/lib/
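Before moving on, it may help to confirm the installation is wired up. A quick sanity check, assuming the paths above, might look like this (it requires a working Hadoop/Sqoop environment, so adapt it to your setup):

```shell
# Verify the MySQL connector jar landed in Sqoop's lib directory (path is an assumption):
ls $SQOOP_HOME/lib/ | grep mysql-connector

# Print the Sqoop release to confirm the binary is on the PATH:
sqoop version
```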

2. Using Sqoop

2.1 Import data from MySQL into HDFS. The default target path is /user/<username>; note that the username here is the OS user logged into the host, not the database user.
sqoop                                       ## the sqoop command
import                                      ## import data
--connect jdbc:mysql://hadoop:3306/test     ## JDBC URL for MySQL; "test" is the source database name
--username root                             ## MySQL username
--password hadoop123                        ## MySQL password
--table users                               ## table to import from MySQL
--fields-terminated-by '\t'                 ## field separator for rows in the output files
--null-string '**'                          ## render NULL values as **
-m 1                                        ## use 1 map task for the copy (default is 4)
--append                                    ## append the imported data to the existing dataset
--hive-import                               ## copy the MySQL table into Hive; without this option, data goes to HDFS
--check-column 'id'                         ## column to check for incremental imports
--incremental append                        ## incremental import mode
--last-value 6                              ## boundary value: only rows with check-column greater than 6 are imported
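After the import finishes, the result can be inspected on HDFS. A sketch of such a check, assuming the default /user/<username> layout for an OS user named hadoop:

```shell
# List the files the import produced (one part file per map task; -m 1 gives a single file):
hadoop fs -ls /user/hadoop/users

# Inspect a few rows; fields should be tab-separated and NULLs shown as **:
hadoop fs -cat /user/hadoop/users/part-m-00000 | head -5
```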
2.2 Export data from HDFS to MySQL
sqoop
export                                      ## copy data from Hive/HDFS into MySQL
--connect jdbc:mysql://hadoop:3306/test
--username root
--password admin
--table users_bak                           ## target table in MySQL; it must already exist before the export
--export-dir '/hive/users'                  ## HDFS directory holding the Hive files to export
--fields-terminated-by '\t'                 ## field separator used in the exported files
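To verify the export arrived on the database side, a hedged check using the same credentials as the command above (the mysql client prompts for the password after -p):

```shell
# Count the rows that landed in the target table:
mysql -u root -p -e 'SELECT COUNT(*) FROM test.users_bak;'
```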

2.3 Save the command as a job and run it

sqoop
job                                         ## the job command
--create myjob                              ## name of the job to create
--                                          ## required separator; cannot be omitted. The job definition follows
import
--connect jdbc:mysql://hadoop:3306/test
--username root
--password hadoop123
--table users
--fields-terminated-by '\t'
--null-string '**'
-m 1
--append
--hive-import

List jobs: sqoop job --list

Run a job: sqoop job --exec myjob

Delete a job: sqoop job --delete myjob

When you run a job you will be asked for a password; enter the MySQL password from the job definition, not your login password. If you do not want to enter the password every time, enable password saving in $SQOOP_HOME/conf/sqoop-site.xml:
  <property>
    <name>sqoop.metastore.client.record.password</name>
    <value>true</value>
    <description>If true, allow saved passwords in the metastore.</description>
  </property>
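The saved-job lifecycle above can be sketched end to end, using the myjob name defined earlier (a shortened definition is shown here for readability):

```shell
# Create a saved job (the -- separator before the tool name is mandatory):
sqoop job --create myjob -- import --connect jdbc:mysql://hadoop:3306/test \
  --username root --password hadoop123 --table users -m 1

sqoop job --list          # myjob should appear in the listing
sqoop job --show myjob    # print the stored definition
sqoop job --exec myjob    # run the import (prompts for the password unless saving is enabled)
sqoop job --delete myjob  # remove the saved job
```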


         

