Wednesday, February 20, 2013

Flume 1.3.1 Windows binary release online

Andy Blozhou, a Chinese Flume enthusiast, provides precompiled Windows binaries of Flume 1.3.1, including a startup .bat and an Avro client .bat.
You can grab this build from his website http://abloz.com/flume/windows_download.html:

======== snip ========

This is the flume-ng 1.3.1 Windows version for download.

Simple usage:
unzip apache-flume-1.3.1-bin.zip
run bin/flume.bat for the agent.
run bin/flume-avroclient.bat for the avro-client.
Both need to be modified for your own environment.
Details:
(To compile flume-ng on Windows, please refer to http://mapredit.blogspot.com/2012/07/run-flume-13x-on-windows.html or my Chinese version http://abloz.com/2013/02/18/compile-under-windows-flume-1-3-1.html)

1. Download the Windows version of Flume 1.3.1 (apache-flume-1.3.1-bin.zip) from http://abloz.com/flume/windows_download.html
2. Unzip apache-flume-1.3.1-bin.zip to a directory.
3. Install JDK 1.6 from Oracle and set JAVA_HOME in your environment.
Download from http://www.oracle.com/technetwork/java/javase/downloads/index.html
4. Test the agent:
4.1 Modify the settings in conf/console.conf and conf/hdfs.conf for the agent test (see the first sketch after this list).
4.2 Test a syslog source with a console-out sink:
4.2.1 Check flume.bat and modify the variables for your environment.
4.2.2 Click flume.bat
4.2.3 On another computer, run:
echo "<13>test msg" >/tmp/msg
nc -v your_flume_sysloghost port < /tmp/msg
4.2.4 Check the Flume output on your syslog host.
4.2.5 For samples, see http://abloz.com/2013/02/18/compile-under-windows-flume-1-3-1.html
4.3 Test the avro-client (see the second sketch below):
4.3.1 Run a Flume agent with an Avro source on a node.
4.3.2 Modify flume-avroclient.bat and head.txt for your environment.
4.3.3 Run flume-avroclient.bat
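
A minimal conf/console.conf for the syslog-to-console test in step 4.2 could look like the sketch below; the agent name, component names, host, and port are assumptions and must match the agent name flume.bat passes to Flume:

# sketch of conf/console.conf - agent/component names, host and port are assumptions
agent.sources = syslog-in
agent.channels = mem-ch
agent.sinks = console-out

# syslog TCP source listening for the test messages sent via nc
agent.sources.syslog-in.type = syslogtcp
agent.sources.syslog-in.host = 0.0.0.0
agent.sources.syslog-in.port = 5140
agent.sources.syslog-in.channels = mem-ch

# in-memory channel between source and sink
agent.channels.mem-ch.type = memory
agent.channels.mem-ch.capacity = 1000

# logger sink prints the received events to the console
agent.sinks.console-out.type = logger
agent.sinks.console-out.channel = mem-ch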
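For the avro-client test in step 4.3, flume-avroclient.bat essentially wraps Flume's Avro client class; a sketch of what such a call could look like (JDK and Flume paths, host, and port are assumptions for your environment):

rem sketch only - adjust paths, host and port to your environment
set JAVA_HOME=C:\Program Files\Java\jdk1.6.0_45
set FLUME_HOME=C:\apache-flume-1.3.1-bin
rem sends the contents of head.txt to the agent's Avro source
"%JAVA_HOME%\bin\java" -cp "%FLUME_HOME%\lib\*" org.apache.flume.client.avro.AvroCLIClient -H avro_source_host -p 41414 -F head.txt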

Tested on Windows 7 (32-bit).

enjoy!
Andy
2013.2.20
http://abloz.com

Wednesday, February 6, 2013

LZO Compression with Oozie

It can happen, when one of the compression codecs is switched to LZO, that Oozie can't start any MR job successfully. The codecs are usually configured in core-site.xml:

<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,com.hadoop.compression.lzo.LzoCodec,com.hadoop.compression.lzo.LzopCodec,org.apache.hadoop.io.compress.BZip2Codec</value>
</property>
<property>
  <name>io.compression.codec.lzo.class</name>
  <value>com.hadoop.compression.lzo.LzoCodec</value>
</property>

Oozie then reports a ClassNotFound error (java.lang.ClassNotFoundException: Class com.hadoop.compression.lzo.LzoCodec not found). To get the jobs running, copy or link hadoop-lzo.jar into /var/lib/oozie/ and restart the Oozie server.
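
For example (a sketch; the source path of hadoop-lzo.jar is an assumption for a CDH-style layout and may differ on your system):

# link the LZO codec jar where the Oozie server picks it up, then restart
ln -s /usr/lib/hadoop/lib/hadoop-lzo.jar /var/lib/oozie/hadoop-lzo.jar
service oozie restart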

The second and most common issue: most people forget to set up the shared lib directory:

[root@hadoop2 ~]# sudo -u hdfs hadoop fs -mkdir /user/oozie
[root@hadoop2 ~]# sudo -u hdfs hadoop fs -chown oozie:oozie /user/oozie
[root@hadoop2 ~]# mkdir /tmp/share && cd /tmp/share && tar xvfz /usr/lib/oozie/oozie-sharelib.tar.gz
[root@hadoop2 share]# sudo -u oozie hadoop fs -put share /user/oozie/share
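
Jobs then pick up the shared lib when the system libpath flag is set in their job.properties, for example (a sketch; host names, ports, and the workflow path are assumptions):

# job.properties (sketch)
nameNode=hdfs://hadoop2:8020
jobTracker=hadoop2:8021
oozie.wf.application.path=${nameNode}/user/${user.name}/my-wf
# resolve jars from the shared lib in HDFS
oozie.use.system.libpath=true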

From CDH 4.1 on, Oozie supports a special jar package, called an uber JAR: a jar that carries the jars it depends on in a lib/ folder inside of it. After enabling this feature, users can use such a jar in their MapReduce jobs and notify Oozie about this special jar file. You enable the feature per oozie-site.xml:

<property>
  <name>oozie.action.mapreduce.uber.jar.enable</name>
  <value>true</value>
</property>
When this property is set, users can use the oozie.mapreduce.uber.jar configuration property in their MapReduce workflows to notify Oozie that the specified JAR file is an uber JAR.
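
A map-reduce action could then reference the jar like this (a sketch; the action name and jar path are assumptions):

<action name="mr-uber">
  <map-reduce>
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <configuration>
      <property>
        <!-- HDFS path of the uber JAR; the jars in its lib/ folder are added to the job classpath -->
        <name>oozie.mapreduce.uber.jar</name>
        <value>${nameNode}/user/${wf:user()}/lib/my-uber.jar</value>
      </property>
    </configuration>
  </map-reduce>
  <ok to="end"/>
  <error to="fail"/>
</action>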