Create a Compressed RC file table in HIVE

Here are the config parameters to set in the Hive client when you want to create a compressed RCFile table in Hive. Note: RCFile tables can only be populated when the data is already in HDFS; unfortunately, I haven't figured out a way to make it work with LOAD DATA LOCAL.

-- compress the final output of the query
SET hive.exec.compress.output=true;
-- cap input splits at roughly 256 MB
SET mapred.max.split.size=256000000;
-- compress at the block level rather than per record
SET mapred.output.compression.type=BLOCK;
-- use the Snappy codec for the compressed output
SET mapred.output.compression.codec=org.apache.hadoop.io.compress.SnappyCodec;
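With those parameters set in the session, one common pattern (a sketch, not from the original post; the table and column names below are made up for illustration) is to land the raw data in a plain-text staging table and then populate the RCFile table with an INSERT ... SELECT, so the MapReduce job writes the block-compressed RCFile output:

-- Hypothetical staging table: plain text, loadable with LOAD DATA (local or HDFS path)
CREATE TABLE staging_logs (
  id INT,
  msg STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE;

LOAD DATA LOCAL INPATH '/tmp/logs.tsv' INTO TABLE staging_logs;

-- Hypothetical target table stored as RCFile
CREATE TABLE logs_rc (
  id INT,
  msg STRING
)
STORED AS RCFILE;

-- The INSERT runs a job, so the output is written as block-compressed
-- RCFile using the codec configured above (Snappy here)
INSERT OVERWRITE TABLE logs_rc
SELECT id, msg FROM staging_logs;

Because the data is rewritten by the job, this also sidesteps the LOAD DATA LOCAL issue mentioned above: LOAD DATA only moves files into place without converting them, so it can't produce RCFile data on its own.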
 