Wednesday, December 20th, 2006

Maximizing Performance with Compression and Combination

Category: Articles

We are having fun watching Firebug’s network mode as we visit various sites, and seeing where the browser spends time waiting.

Yesterday we talked about cheating the system by using CNAMEs to get around the number-of-connections-per-host limitation.

Today, Niels Leenheer blogged about making your pages load faster by combining and compressing JavaScript and CSS files.

The key points from his testing are:

  • Combining the set of JavaScript files into one
  • Compressing that resulting file
  • Caching the compressed file to disk (instead of compressing on the fly for each request)

Niels released his combine.php script that does the work for him.
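Niels’s actual script is PHP, but the combine / gzip / cache-to-disk flow it implements can be sketched in a few lines (the function name and the mtime-based staleness check here are illustrative assumptions, not his code):

```python
import gzip
import os

def combine_and_compress(paths, cache_path):
    """Concatenate JS files, gzip the result, and cache it to disk.

    Recompresses only when a source file is newer than the cached
    copy, so the expensive work happens once per change rather than
    on every request.
    """
    if os.path.exists(cache_path):
        cache_mtime = os.path.getmtime(cache_path)
        if all(os.path.getmtime(p) <= cache_mtime for p in paths):
            with open(cache_path, "rb") as f:
                return f.read()  # serve the cached gzip as-is

    parts = []
    for p in paths:
        with open(p, "rb") as f:
            parts.append(f.read())
    compressed = gzip.compress(b"\n".join(parts))
    with open(cache_path, "wb") as f:
        f.write(compressed)
    return compressed
```

A real handler would also send `Content-Encoding: gzip` and suitable cache headers along with the bytes.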

There seems to be a natural tension between:

“One big compressed file to minimize connections and size”

and:

“Multiple files are good as they can be cached separately (in the first case, a change to file A invalidates the cache for the entire combined file), so use the CNAME trick to allow multiple downloads at once”


Posted by Dion Almaer at 7:30 am
10 Comments



Yeah, that’s a great tip. JavaScript source file loads are a serious bottleneck in the browser because they are handled serially, so anything that increases the performance of JavaScript is a huge win.

Comment by Ryan Breen — December 20, 2006

See also Ed Eliot’s excellent PHP-based js/css combinator script, which is now inclusive of JSMin. The overall approach here is very consistent with what we’re advocating at Yahoo and with YUI — but Dion, your point is an interesting one about the challenges of determining a logical caching strategy that leverages modular, reused css/js chunks and relies on them being downloaded only on the first use. Modularity and aggregation/concatenation seem irreconcilable in many implementations.

Comment by Eric Miraglia — December 20, 2006

We’ve been doing this on vox from the beginning. It works pretty well. We also use another technique to bootstrap the javascript before the page finishes loading. Ads and images don’t slow us down. :)

Comment by David Davis — December 20, 2006

I use the compression of CSS/JS stuff myself, though it happens that I have several files to load. Does that PHP script know which file to load in what sequence? :)

Comment by Mikhail — December 20, 2006

hi, I’m from Argentina. In our company we have serious problems with website performance; our app has 500 KB of JS, plus a lot of CSS and images. One of my tasks was to find a way to improve performance, so here is what we did.

With JavaScript files:
1. Join all the JS into one file
2. Use custom_rhino.jar to compress the JS (strip comments, rename functions, vars, etc. with shorter names)
3. Delete all \n and \r (a one-line file)

With CSS files:
1. Join all the CSS files into one
2. Remove comments and \n and \r

With images, we use only one file for all images: everything is joined into a single file, and we use CSS positioning to display each one in the app.

All of that data is cached and sent compressed to the browser (HTTP 1.1).

We obtained an excellent improvement doing that.
Bye

Comment by Ulises Enrico — December 20, 2006

hi there,

great!

Is there a way (tool/script/task) to do this automatically in the Java world, where most of us use Ant? It would be nice if someone could demonstrate it for the Petstore application. I wish Sun would take an initiative on this so that a lot of Java developers could incorporate it into their process. It would be a bonus if Eclipse and NetBeans had a toolkit in their web development modules to automatically enable this.

Thank you,

BR,
~A

Comment by anjan bacchu — December 20, 2006

We have been using this technique at http://www.mapsack.com. One big advantage is that it is possible to version each ‘build’ of the combined JavaScript, thus stopping any browser caching issues. Basically, every time our combined JS files change, their version number gets incremented, e.g. comb/explore-23.js. We automatically update the pages to request the most up-to-date version, so when a change is made, all clients will download the new version (as the filename is different).
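An alternative to hand-incremented build numbers is deriving the version from the file’s contents; a minimal sketch (the hashing scheme and function name are my own, not mapsack.com’s):

```python
import hashlib
import os

def versioned_name(path):
    """Build a cache-busting file name from a content hash.

    Any change to the combined JS yields a new URL (e.g. a name
    like explore-3c2a91f0.js), so clients can cache the old one
    forever and still pick up new builds immediately.
    """
    with open(path, "rb") as f:
        digest = hashlib.md5(f.read()).hexdigest()[:8]
    base, ext = os.path.splitext(path)
    return f"{base}-{digest}{ext}"
```

The pages that reference the script are then regenerated to point at the current hashed name, exactly as with the incrementing scheme.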

Comment by Alastair James — December 20, 2006

Anjan,

If you’re using Dojo, just make a custom build. It combines all your code into a single file and doesn’t require that you change the way you include it in your page (unlike all of these other methods). As for compression, there are servlet filters, or if you really care about performance, host these (mostly) static resources on a separate server/port/domain with an HTTP server that has gzipping built in (mod_deflate with disk caching should do what you need).
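For reference, the mod_deflate-plus-disk-cache setup Alex describes might look something like this in an Apache 2.2 configuration (paths and MIME types are illustrative, not from any specific deployment):

```apache
# mod_deflate: gzip CSS and JS on the way out
AddOutputFilterByType DEFLATE text/css application/x-javascript

# mod_cache / mod_disk_cache: keep compressed responses on disk
CacheEnable disk /static/
CacheRoot /var/cache/apache2/mod_disk_cache
```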

Regards

Comment by Alex Russell — December 21, 2006

Great post, Dion; these techniques are tried and true. And Kae, thanks for that input as well – very valuable. I’ve been dynamically building CSS and JS files for some time, but now I’m getting more into the benefits of caching. More often than not, unless the JS deals with navigational functionality, I opt for external files, since most of our sites are going for multiple page views during a session. Sometimes we use a combination of PHP and mod_rewrite to achieve the caching effects we’re going for.

Comment by Frederick Townes — December 22, 2006

So you want automated, compressed JS/CSS files in your builds? Check out my blog http://nnbs.blogspot.com/; it’s simple to achieve.

Comment by hat27533 — June 24, 2008
