Hive supported file formats and compression algorithms (Hive 1.2.1)

Overview

As long as the file format and compression type are configured correctly (for example TextFile + Gzip, SequenceFile + Snappy, and so on), Hive can read and parse the data as expected and provide the usual SQL functionality on top of it.
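As a quick check of how a session is currently configured, running SET with a property name but no value in the Hive CLI or Beeline prints the current setting. A minimal sketch, not part of the original walkthrough:

-- Print the current values of the two settings used throughout this article:
SET hive.exec.compress.output;
SET mapred.output.compression.codec;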

The SequenceFile format is itself designed with compression in mind. Compressing a SequenceFile therefore does not mean writing the file first and compressing the whole file afterwards; instead, the content (the value fields) is compressed while the SequenceFile is being generated. The compressed result still presents itself externally as a SequenceFile.
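Hadoop exposes this internal SequenceFile compression through io.seqfile.compression.type, which chooses between RECORD (each value compressed on its own) and BLOCK (batches of records compressed together). A minimal sketch, assuming the same mapred-era property names used elsewhere in this article:

-- Compress batches of records inside the SequenceFile rather than one record at a time:
SET hive.exec.compress.output=true;
SET mapred.output.compress=true;
SET io.seqfile.compression.type=BLOCK;
SET mapred.output.compression.codec=org.apache.hadoop.io.compress.DefaultCodec;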

RCFile, ORCFile, Parquet and Avro handle compression in the same way as SequenceFile.

File formats: TextFile, SequenceFile, RCFile, ORCFile, Parquet, Avro

Compression algorithms (codecs):

No. | Compression format | Algorithm | Multiple files | Splittable | Tool  | Extension after compression
1   | DEFLATE            | DEFLATE   | No             | No         | None  | .deflate
2   | Gzip               | DEFLATE   | No             | No         | gzip  | .gz
3   | Bzip2              | Bzip2     | No             | Yes        | bzip2 | .bz2
4   | LZO                | LZO       | No             | No         | lzop  | .lzo
5   | LZ4                | ???       | ??             | ??         | ???   | ???
6   | Snappy             | ???       | ??             | ??         | ???   | ???
7   | ZLIB               | ???       | ??             | ??         | ???   | ???
8   | Zip                | DEFLATE   | Yes            | Yes, within file boundaries | zip | .zip
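For the columnar formats listed above, the codec can also be chosen per table instead of through the mapred.* job settings. A minimal ORC sketch, not part of the original walkthrough (the table name is made up for illustration; orc.compress accepts NONE, ZLIB or SNAPPY):

-- ORC picks its codec from a table property rather than from the job-level settings:
CREATE TABLE student_orc_snappy (id string, name string)
STORED AS ORC
TBLPROPERTIES ('orc.compress'='SNAPPY');

INSERT OVERWRITE TABLE student_orc_snappy SELECT * FROM student;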
TextFile (text file), not compressed
-- Create a table stored as a text file:
CREATE EXTERNAL TABLE student_text (id string, name string)
ROW FORMAT DELIMITED
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n'
STORED AS TEXTFILE;
-- Import data into this table (this starts a MapReduce job):
INSERT OVERWRITE TABLE student_text SELECT * FROM student;

You can see that the resulting data file is an uncompressed text file:

hdfs dfs -cat /user/hive/warehouse/student_text/000000_0

1001810081,cheyo
1001810082,pku
1001810083,rocky
1001810084,stephen
2002820081,sql
2002820082,hello
2002820083,hijj
3001810081,hhhhhhh
3001810082,ABBBBBB
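To confirm which storage format Hive actually recorded for a table, DESCRIBE FORMATTED prints the InputFormat, OutputFormat and SerDe. A small check, not part of the original walkthrough:

-- For a TEXTFILE table this typically reports org.apache.hadoop.mapred.TextInputFormat as the InputFormat:
DESCRIBE FORMATTED student_text;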
TextFile, DEFLATE compression
-- Create a table stored as a text file:
CREATE TABLE student_text_def (id string, name string)
ROW FORMAT DELIMITED
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n'
STORED AS TEXTFILE;
-- Set the compression type to DEFLATE:
SET hive.exec.compress.output=true;
SET mapred.output.compress=true;
SET mapred.output.compression.codec=org.apache.hadoop.io.compress.DefaultCodec;
-- Import data:
INSERT OVERWRITE TABLE student_text_def SELECT * FROM student;
-- View data:
SELECT * FROM student_text_def;

Viewing the data files, you can see that the output consists of multiple .deflate files.

hdfs dfs -ls /user/hive/warehouse/student_text_def/
-rw-r--r--   2015-09-16 12:48 /user/hive/warehouse/student_text_def/000000_0.deflate
-rw-r--r--   2015-09-16 12:48 /user/hive/warehouse/student_text_def/000001_0.deflate
-rw-r--r--   2015-09-16 12:48 /user/hive/warehouse/student_text_def/000002_0.deflate
TextFile, Gzip compression
-- Create a table stored as a text file:
CREATE TABLE student_text_gzip (id string, name string)
ROW FORMAT DELIMITED
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n'
STORED AS TEXTFILE;
-- Set the compression type to Gzip:
SET hive.exec.compress.output=true;
SET mapred.output.compress=true;
SET mapred.output.compression.codec=org.apache.hadoop.io.compress.GzipCodec;
-- Import data:
INSERT OVERWRITE TABLE student_text_gzip SELECT * FROM student;
-- View data:
SELECT * FROM student_text_gzip;

Viewing the data files, you can see that the output consists of multiple .gz files. Decompressing a .gz file shows the plaintext:

hdfs dfs -ls /user/hive/warehouse/student_text_gzip/
-rw-r--r--  2015-09-15 10:03 /user/hive/warehouse/student_text_gzip/000000_0.gz
-rw-r--r--  2015-09-15 10:03 /user/hive/warehouse/student_text_gzip/000001_0.gz
-rw-r--r--  2015-09-15 10:03 /user/hive/warehouse/student_text_gzip/000002_0.gz
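Hive can also read gzip-compressed text produced outside of Hive: when a .gz file is loaded into a TEXTFILE table, the codec is picked from the file extension at read time. A minimal sketch with a hypothetical local file:

-- /tmp/student.csv.gz is a hypothetical, pre-compressed copy of the student data:
LOAD DATA LOCAL INPATH '/tmp/student.csv.gz' INTO TABLE student_text_gzip;
SELECT * FROM student_text_gzip;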
TextFile, Bzip2 compression
-- Create a table stored as a text file:
CREATE TABLE student_text_bzip2 (id string, name string)
ROW FORMAT DELIMITED
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n'
STORED AS TEXTFILE;
-- Set the compression type to Bzip2:
SET hive.exec.compress.output=true;
SET mapred.output.compress=true;
SET mapred.output.compression.codec=org.apache.hadoop.io.compress.BZip2Codec;
-- Import data:
INSERT OVERWRITE TABLE student_text_bzip2 SELECT * FROM student;
-- View data:
SELECT * FROM student_text_bzip2;

Viewing the data files, you can see that the output consists of multiple .bz2 files. Decompressing a .bz2 file shows the plaintext:

hdfs dfs -ls /user/hive/warehouse/student_text_bzip2
-rw-r--r--  2015-09-15 10:09 /user/hive/warehouse/student_text_bzip2/000000_0.bz2
-rw-r--r--  2015-09-15 10:09 /user/hive/warehouse/student_text_bzip2/000001_0.bz2
-rw-r--r--  2015-09-15 10:09 /user/hive/warehouse/student_text_bzip2/000002_0.bz2
TextFile, LZO compression
-- Create the table:
CREATE TABLE student_text_lzo (id string, name string)
ROW FORMAT DELIMITED
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n'
STORED AS TEXTFILE;
-- Set LZO compression:
SET hive.exec.compress.output=true;
SET mapred.output.compress=true;
SET mapred.output.compression.codec=com.hadoop.compression.lzo.LzopCodec;
-- Import data:
INSERT OVERWRITE TABLE student_text_lzo SELECT * FROM student;
-- Query data:
SELECT * FROM student_text_lzo;

Viewing the data files, you should see multiple .lzo files; decompressing an .lzo file shows the plaintext. (Not tested here; the lzop library needs to be installed.)

TextFile, LZ4 compression

-- Create the table:
CREATE TABLE student_text_lz4 (id string, name string)
ROW FORMAT DELIMITED
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n'
STORED AS TEXTFILE;
-- Set LZ4 compression:
SET hive.exec.compress.output=true;
SET mapred.output.compress=true;
SET mapred.output.compression.codec=org.apache.hadoop.io.compress.Lz4Codec;
-- Import data:
INSERT OVERWRITE TABLE student_text_lz4 SELECT * FROM student;

Viewing the data files, you can see that the output consists of multiple .lz4 files. Using cat on a .lz4 file only shows the compressed content.

hdfs dfs -ls /user/hive/warehouse/student_text_lz4
-rw-r--r--  2015-09-16 12:06 /user/hive/warehouse/student_text_lz4/000000_0.lz4
-rw-r--r--  2015-09-16 12:06 /user/hive/warehouse/student_text_lz4/000001_0.lz4
-rw-r--r--  2015-09-16 12:06 /user/hive/warehouse/student_text_lz4/000002_0.lz4
TextFile, Snappy compression
-- Create the table:
CREATE TABLE student_text_snappy (id string, name string)
ROW FORMAT DELIMITED
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n'
STORED AS TEXTFILE;
-- Set Snappy compression:
SET hive.exec.compress.output=true;
SET mapred.compress.map.output=true;
SET mapred.output.compress=true;
SET mapred.output.compression=org.apache.hadoop.io.compress.SnappyCodec;
SET mapred.output.compression.codec=org.apache.hadoop.io.compress.SnappyCodec;
SET io.compression.codecs=org.apache.hadoop.io.compress.SnappyCodec;
-- Import data:
INSERT OVERWRITE TABLE student_text_snappy SELECT * FROM student;
-- Query data:
SELECT * FROM student_text_snappy;

Viewing the data files, you can see that the output consists of multiple .snappy files. Using cat on a .snappy file only shows the compressed content:

hdfs dfs -ls /user/hive/warehouse/student_text_snappy
Found 3 items
-rw-r--r--   2015-09-15 16:42 /user/hive/warehouse/student_text_snappy/000000_0.snappy
-rw-r--r--   2015-09-15 16:42 /user/hive/warehouse/student_text_snappy/000001_0.snappy
-rw-r--r--   2015-09-15 16:42 /user/hive/warehouse/student_text_snappy/000002_0.snappy
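The Snappy example above also compresses the map output (mapred.compress.map.output). Hive additionally has its own switches for compressing the intermediate data it shuffles between MapReduce stages, independently of the final output codec. A minimal sketch, not taken from the original article:

-- Compress only Hive's intermediate data between job stages:
SET hive.exec.compress.intermediate=true;
SET hive.intermediate.compression.codec=org.apache.hadoop.io.compress.SnappyCodec;
SET hive.intermediate.compression.type=BLOCK;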
SequenceFile (sequence file), DEFLATE compression
-- Create a table stored as a SequenceFile:
CREATE TABLE student_seq_def (id string, name string)
ROW FORMAT DELIMITED
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n'
STORED AS SEQUENCEFILE;
-- Set the compression type to DEFLATE:
SET hive.exec.compress.output=true;
SET mapred.output.compress=true;
SET mapred.output.compression.codec=org.apache.hadoop.io.compress.DefaultCodec;
-- Import data:
INSERT OVERWRITE TABLE student_seq_def SELECT * FROM student;
-- View data:
SELECT * FROM student_seq_def;

Viewing the data file, it is binary and not human-readable.

hdfs dfs -ls /user/hive/warehouse/student_seq_def/
-rw-r--r--  /user/hive/warehouse/student_seq_def/000000_0
SequenceFile, Gzip compression
-- Create a table stored as a SequenceFile:
CREATE TABLE student_seq_gzip (id string, name string)
ROW FORMAT DELIMITED
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n'
STORED AS SEQUENCEFILE;
-- Set the compression type to Gzip:
SET hive.exec.compress.output=true;
SET mapred.output.compress=true;
SET mapred.output.compression.codec=org.apache.hadoop.io.compress.GzipCodec;
-- Import data:
INSERT OVERWRITE TABLE student_seq_gzip SELECT * FROM student;
-- View data:
SELECT * FROM student_seq_gzip;

Viewing the data file, it is binary and cannot be decompressed directly with gzip, because the compression is applied inside the SequenceFile rather than to the file as a whole:

hdfs dfs -ls /user/hive/warehouse/student_seq_gzip/
-rw-r--r--  /user/hive/warehouse/student_seq_gzip/000000_0
RCFile, Gzip compression
-- Create a table stored as an RCFile:
CREATE TABLE student_rcfile_gzip (id string, name string)
ROW FORMAT DELIMITED
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n'
STORED AS RCFILE;
-- Set the compression type to Gzip:
SET hive.exec.compress.output=true;
SET mapred.output.compress=true;
SET mapred.output.compression.codec=org.apache.hadoop.io.compress.GzipCodec;
-- Import data:
INSERT OVERWRITE TABLE student_rcfile_gzip SELECT * FROM student;
