I find the Page Speed tool from Google extremely helpful for optimising my websites. Here is a tip that can save time if your website is failing the “Optimize Images” test. Using the built-in Smush.it integration is one option, but if most of your images are jpegs you can achieve the same result and save time with a command line tool called jpegtran, using the -optimise parameter for lossless compression and the -copy none parameter to strip out image meta-data.
Disclaimer: As with any image processing, always keep a copy of the original images.
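For example, assuming your images live under a directory called images (just an example path), you could copy the whole tree before optimising:
cp -a images images-original   # "images" is an example path, adjust to suit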
To install jpegtran in Ubuntu, do:
sudo apt-get install libjpeg-progs
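A quick way to check the install worked is to confirm the binary is on your PATH:
which jpegtran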
- To optimise a single jpeg image:
jpegtran -copy none -optimise -outfile image.jpg image.jpg
- To optimise all jpegs in the current directory:
for img in *.jpg; do jpegtran -copy none -optimise -outfile "$img" "$img"; done
- To optimise all jpegs in the current directory and all child directories:
find . -name "*.jpg" -print0 | xargs -0 -I filename jpegtran -copy none -optimise -outfile filename filename
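If some of your files use a .jpeg or upper-case extension, a case-insensitive variant of the same command should pick them up as well (a sketch, adjust the patterns to match your file names):
find . \( -iname "*.jpg" -o -iname "*.jpeg" \) -print0 | xargs -0 -I filename jpegtran -copy none -optimise -outfile filename filename
Running du -sh . before and after is a simple way to see how much space was saved.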
If you need to optimise various image formats, there is a PHP CLI tool called smusher which uses Smush.it and can work recursively on directories. Might be worth a look. It would be nice if Smush.it had an API; their FAQ mentions they are working on one.