Create a compressed RCFile table in Hive

Here are the config parameters to set in the Hive client when you want to create a compressed RCFile table in Hive. Note: RCFile tables can only be populated when the data is already in HDFS. Unfortunately, I haven't figured out a way to make it work with LOAD DATA LOCAL.

-- Compress the output that Hive writes
SET hive.exec.compress.output=true;
-- Cap input split size at ~256 MB
SET mapred.max.split.size=256000000;
-- Use block-level (rather than record-level) compression
SET mapred.output.compression.type=BLOCK;
-- Compress with Snappy
SET mapred.output.compression.codec=org.apache.hadoop.io.compress.SnappyCodec;
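With those set, the usual pattern is to point a staging table at the raw files already sitting in HDFS and then do an INSERT ... SELECT into the RCFile table; the INSERT is what actually rewrites the data as compressed RCFile. A minimal sketch, where the table names, columns, and the /data/raw/logs path are all made up for illustration:

-- Hypothetical staging table over raw data already in HDFS
CREATE EXTERNAL TABLE logs_staging (
  ts STRING,
  msg STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/data/raw/logs';

-- Destination table stored as RCFile
CREATE TABLE logs_rc (
  ts STRING,
  msg STRING
)
STORED AS RCFILE;

-- This runs a MapReduce job whose output is written as
-- block-compressed Snappy RCFile
INSERT OVERWRITE TABLE logs_rc
SELECT ts, msg FROM logs_staging;

This is presumably also why LOAD DATA LOCAL doesn't help here: LOAD DATA just moves files into the table's directory without rewriting them, so it can't convert anything to RCFile; the INSERT ... SELECT does the actual conversion.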