Tuesday, September 30th, 2008

Smushit.com makes image optimizing a breeze

Category: Performance, The Ajax Experience

We’ve heard a lot about optimizing CSS, HTML and JavaScript, but one thing that is less talked about is how much extra information image editors put into image files. You might think you’ve done a great job optimizing your GIFs, PNGs and JPGs while keeping them visually pleasing, but open one in a text editor and you’ll see there is quite a lot of data you can save by removing information about the image editor used, the date the file was last edited and other bits that really are redundant.

There are a lot of free tools that strip this information from the files for you and squeeze some extra optimization out of the file without affecting the look. The problem is that they are all command-line based and you need to know how to use them. Stoyan Stefanov and Nicole Sullivan of Yahoo’s Exceptional Performance team took all of these tools and their experience in using them and built one application that does all the optimizations for you in one go:
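To get a feel for what these optimizers actually remove, here’s a minimal, illustrative Python sketch (not the code behind any of the real tools) that walks a PNG’s chunks and drops the ancillary text and timestamp chunks, while keeping everything needed to render the image:

```python
import struct
import zlib

PNG_SIG = b"\x89PNG\r\n\x1a\n"
# Ancillary chunks that carry only metadata (editor name, timestamps, comments)
STRIP = {b"tEXt", b"zTXt", b"iTXt", b"tIME"}

def chunk(ctype: bytes, cdata: bytes) -> bytes:
    """Serialize one PNG chunk: length, type, data, CRC over type+data."""
    return (struct.pack(">I", len(cdata)) + ctype + cdata
            + struct.pack(">I", zlib.crc32(ctype + cdata)))

def strip_png_metadata(data: bytes) -> bytes:
    """Copy a PNG byte-for-byte, dropping the metadata-only chunks in STRIP."""
    assert data[:8] == PNG_SIG, "not a PNG file"
    out, pos = [PNG_SIG], 8
    while pos < len(data):
        (length,) = struct.unpack(">I", data[pos:pos + 4])
        ctype = data[pos + 4:pos + 8]
        end = pos + 12 + length  # 4 length + 4 type + data + 4 CRC
        if ctype not in STRIP:
            out.append(data[pos:end])
        pos = end
    return b"".join(out)
```

Real optimizers like pngcrush go much further (re-trying filter and compression strategies), but stripping metadata chunks alone is often the cheapest win.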

Smushit Screenshot

You can upload images, give it a URL or use Smush.it as a Firefox extension or bookmarklet. Smush.it will show you how many bytes you can save by removing cruft from the images, and gives you all the optimized images as a zip file so you can replace them on your site.

Here’s a video of Stoyan and Nicole presenting Smushit.com at The Ajax Experience in Boston (sorry about the audio):


Posted by Chris Heilmann at 11:01 am




What is a Smu?

Comment by jeromew — September 30, 2008

lol… on a more serious note, would this also strip out the digital info that helps to identify an image in the case of image theft?

Comment by jonhartmann — September 30, 2008

@jeromew lol

@jonhartmann Yes, Smush.it does strip metadata that can contain copyright information.

Stoyan and I have talked about offering an option to keep certain bits of metadata, like copyright. For instance, Getty requires some JPGs to be served with a thumbnail inside the original image.

Is this a feature you would find useful?

Comment by Stubbornella — September 30, 2008

Nice :)

Comment by ThomasHansen — September 30, 2008

This is really cool. I was able to see gains of over 40% in some tests I ran.

Last year at the TAE YUI performance session, the audience was asked for suggestions and I suggested incorporating such functionality into YSlow. It would be great if this were integrated into the YSlow numbers, but the current interface is really nice too.

Comment by sjivan — September 30, 2008

Although I like the idea, I’m not sure that it would change much… if I choose to leave the information in my optimized images, someone else could point smush.it at my image and strip it out. You’d have to do something like add a bit of data to the file that says it has been smushed, and then have smush.it refuse to smush it again.

Comment by jonhartmann — September 30, 2008

jonhartmann, don’t be silly. You’re worrying about the small slice of users who are web savvy, then the smaller subset of those who know about and use smush.it, then the still smaller slice of those who also pirate images, and then the infinitesimally small slice of those who also visit websites made by smush.it users. Even assuming smush.it gets massive market penetration, what’s to prevent would-be pirates from using one of the legions of desktop tools to edit your metadata, or even just taking a plain old screenshot of your content and re-saving it?

On topic: Does this strip out much more than Photoshop’s “Save for Web”? I love seeing tools like this online for sure, but I’m only seeing 1 or 2% reductions on all my web photos.

Comment by dtetto — September 30, 2008

A great tool. The multiple upload function is cool, but would be nice if it was also possible to upload a ZIP archive containing multiple images.

Comment by mattcurrie — September 30, 2008

Am I the only one that would like this locally? Any chance of that? That said, very nice. Thanks for your efforts.

Comment by NICCAI — September 30, 2008

pngcrush is the open source command line utility I suspect they’re using on the backend to compress pngs. Not sure about other formats.

Comment by tlrobinson — September 30, 2008

Image editors do leave a crapton of extra meta data in images, but Photoshop’s “save for web” works really well to strip out all of this data and just save the raw image. I don’t think this service would do a better job than Photoshop, except that Photoshop is like a million dollars, so i guess it has an advantage in that department!

Comment by schammy — September 30, 2008

wikipedia says:
smu = small microscopic undigested

Comment by Jordan1 — September 30, 2008

I smushed this post and:

Smushed 14.12% or 13.84 KB from the size of your image(s). How did we do it? See the table below for more details.


Comment by waynep — September 30, 2008

I haven’t used smu but pngcrush can definitely reduce png file sizes significantly from Photoshop’s “save for web” output. In my experience, usually somewhere between 15-35%.

Comment by eyelidlessness — October 1, 2008

Here’s what I think the killer feature would be: let us submit web page URLs (not just images). I’d enter a given page, and the script would find all the images (image tags and CSS URLs), process them and zip ’em up for me. That way I could quickly go through my live sites (and any new sites, post-deployment) and shrink the size of the whole page.

Heck, combine in the minify project for minifying JS, CSS, and HTML and you’ve got a real site optimizer on your hands!

Comment by dtetto — October 1, 2008
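The page-crawling idea above can be sketched with the Python standard library alone. This hypothetical `find_images` helper (an illustration, not anything Smush.it ships) pulls `img` tags and CSS `url(...)` references out of markup; a real crawler would also fetch the page and resolve relative URLs:

```python
import re
from html.parser import HTMLParser

# Matches url(foo.png), url('foo.png') and url("foo.png") in CSS text
CSS_URL = re.compile(r"url\(\s*['\"]?([^'\")]+)['\"]?\s*\)")

class ImageCollector(HTMLParser):
    """Collect image URLs from <img src> and inline style attributes."""
    def __init__(self):
        super().__init__()
        self.images = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and attrs.get("src"):
            self.images.append(attrs["src"])
        if "style" in attrs:  # inline style="background: url(...)"
            self.images.extend(CSS_URL.findall(attrs["style"]))

def find_images(html_text: str, css_text: str = "") -> list:
    """Return every image URL referenced by the markup and stylesheet text."""
    parser = ImageCollector()
    parser.feed(html_text)
    return parser.images + CSS_URL.findall(css_text)
```

From there, fetching each URL, smushing the bytes and writing a zip would complete the workflow dtetto describes.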

I’m not really worried about image piracy, and like you said, the tools already exist. I’m just pointing out that this tool might have other uses than just optimizing images for the web. Is the loss of that metadata worth the bandwidth savings? As a developer, I’d say it is, but a graphic artist might not appreciate it.

Comment by jonhartmann — October 1, 2008

PngOptimizer is better than PngCrush and PngOut (and comes with source) http://psydk.org/PngOptimizer.php

JPEG is a whole other problem, being a lossy format – however, there are now more and more tools that can do lossless cropping, which is nice.

Comment by ck2 — October 1, 2008

@NICCAI – we have a command-line version and intend to open-source the tool, so you can run it locally and go like:

$smu /path/to/my/s**t ;)

Just need some time to catch our breath after The Ajax Experience sprint ;)

@tlrobinson – yes, pngcrush for PNGs, jpegtran for JPEGs, and ImageMagick to figure out which file is which and also to convert GIFs to PNG8.

@dtetto – that’s what the Firefox extension, companion to the tool, does. You visit any page, click the extension icon, and you’re given the results, along with an option to download a zip. And yes, the idea is that once you have smush.it installed locally, you can make it part of the build process, together with the minification.

@ck2 – pngcrush seemed like an easy choice; it’s a command-line tool and can be automated. The idea was to have something complete working first, then we want to further play with the other optimizers, like optipng, pngrewrite and any others, so we can squeeze out the max possible. JPEG is lossy, true, but jpegtran is the tool we use that allows for lossless operations such as stripping EXIF data.

Comment by stoyan — October 2, 2008
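The “figure out which file is which” step Stoyan mentions can be illustrated by sniffing magic bytes rather than trusting file extensions. The sketch below is an assumption based on his comment, not Smush.it’s actual code, and the optimizer flags shown are illustrative examples of the named tools’ lossless options:

```python
def sniff(data: bytes) -> str:
    """Identify an image format from its leading magic bytes."""
    if data.startswith(b"\x89PNG\r\n\x1a\n"):
        return "png"
    if data.startswith(b"\xff\xd8\xff"):  # JPEG/JFIF SOI marker
        return "jpeg"
    if data[:6] in (b"GIF87a", b"GIF89a"):
        return "gif"
    return "unknown"

# Format -> example lossless optimizer invocation (GIFs would first be
# converted to PNG8, then crushed, per Stoyan's description)
OPTIMIZER = {
    "png": ["pngcrush", "-rem", "alla"],    # drop all ancillary chunks
    "jpeg": ["jpegtran", "-copy", "none"],  # drop EXIF/comments losslessly
    "gif": ["convert"],                     # ImageMagick re-encode to PNG8
}
```

A local build-process wrapper would walk the image directory, call `sniff` on each file, and hand it to the matching command via `subprocess`.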

That’s very useful

Comment by Remedies — November 19, 2008
