pigz - parallel gzip

Here is a short description of pigz:

pigz, which stands for parallel implementation of gzip,
is a fully functional replacement for gzip that exploits
multiple processors and multiple cores to the hilt when compressing data.

And for the installation:

sudo yum install pigz

With pigz, if your multi-processor machine isn't already heavily loaded, you will see a significant speed-up when gzipping files.
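As a quick sketch of the drop-in behaviour (the file name and contents are made up; the gzip fallback is only there so the commands also run on machines where pigz is not installed):

```shell
# Generate a small sample file (name and contents are illustrative).
seq 1 10000 > /tmp/sample.log
rm -f /tmp/sample.log.gz

# pigz is a drop-in replacement for gzip; fall back to gzip here so
# the sketch also runs where pigz is not installed.
GZ=$(command -v pigz || command -v gzip)

# -k keeps the original file; with pigz you can also add -p 4 to set
# the number of processors to use.
"$GZ" -k /tmp/sample.log

# Either way, the output is standard gzip format.
gzip -t /tmp/sample.log.gz && echo "valid gzip archive"
```

Because the output is ordinary gzip format, anything downstream (zcat, gunzip, log analyzers) keeps working unchanged.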


Quick tip on zipping logs in real time.

Sometimes the small things we don't expect to be useful turn out to be exactly that. I ran into this a couple of days back: the amount of logs being generated, and the speed at which the files were rotating, was too much. If I was going to work with this setup for a while, I needed a script or application to make sure the logs were zipped every few seconds. Finding an application for this would take time, and what good is bash if we have to go hunting for applications? A simple bash loop did the trick. Most of us know this, but applying it at the right time is what saved my day. Thanks to bash. Here is the command:

for i in *.log
do
    gzip "$i"
    sleep 5
done
Can it get simpler than this 🙂
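The loop above compresses whatever is there once; to keep up with fast rotation you can wrap it in an outer sweep. A minimal sketch (the directory, file name, and sweep count are purely illustrative):

```shell
# Work in a throwaway directory with one sample log.
rm -rf /tmp/logdemo && mkdir -p /tmp/logdemo && cd /tmp/logdemo
echo "first entry" > app.log

# Sweep the directory a few times (3 here, purely illustrative);
# in practice you would loop until stopped. Each pass compresses
# any *.log files that appeared since the last one.
for sweep in 1 2 3; do
    for i in *.log; do
        [ -e "$i" ] || continue   # no matches: the glob stays literal
        gzip "$i"
    done
    sleep 1
done

ls /tmp/logdemo                   # app.log is now app.log.gz
```

The `[ -e "$i" ] || continue` guard matters: when no `.log` files exist, bash leaves the glob unexpanded and the loop would otherwise try to gzip a file literally named `*.log`.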

concatenate compressed and uncompressed logs


$ find /var/log/apache2 -name 'access.log*gz' -exec zcat {} \; -or -name 'access.log*' -exec cat {} \;

This command streams all of your log files, including the gzipped ones, into one stream which can be piped to awk or some other command for analysis.

Note: if your version of 'find' supports it, use:

find /var/log/apache2 -name 'access.log*gz' -exec zcat {} + -or -name 'access.log*' -exec cat {} +
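To see the one-stream idea end to end, here is a self-contained sketch (the scratch directory, the sample log lines, and the per-IP awk count are all made up for illustration):

```shell
# Build a mixed set of plain and gzipped logs in a scratch directory.
rm -rf /tmp/apachedemo && mkdir -p /tmp/apachedemo
printf '1.2.3.4 GET /\n'        >  /tmp/apachedemo/access.log
printf '5.6.7.8 GET /about\n'   >  /tmp/apachedemo/access.log.1
printf '1.2.3.4 GET /contact\n' | gzip > /tmp/apachedemo/access.log.2.gz

# Same shape as the command above: zcat the gzipped files, cat the
# rest, then pipe the combined stream to awk to count hits per IP.
find /tmp/apachedemo -name 'access.log*gz' -exec zcat {} + \
     -or -name 'access.log*' -exec cat {} + \
  | awk '{count[$1]++} END {for (ip in count) print ip, count[ip]}' \
  | sort
```

With the sample data above this prints `1.2.3.4 2` and `5.6.7.8 1`. The `-or` works because find evaluates predicates left to right: a `.gz` file matches the first `-name` and goes to zcat, so the cat branch is never reached for it.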

by David Winterbottom (codeinthehole.com)

URL: http://feedproxy.google.com/~r/Command-line-fu/~3/iwFUyltYgjM/concatenate-compressed-and-uncompressed-logs