Thursday, June 1st, 2006

Serving JavaScript Fast

Category: JavaScript, Programming

Cal Henderson brings us this new (lengthy) article on ThinkVitamin.com, a look at speeding up the one thing that’s really growing large in Web 2.0-style applications – the JavaScript running behind the scenes.

With our so-called “Web 2.0” applications and their rich content and interaction, we expect our applications to increasingly make use of CSS and JavaScript. To make sure these applications are nice and snappy to use, we need to optimize the size and nature of content required to render the page, making sure we’re delivering the optimum experience. In practice, this means a combination of making our content as small and fast to download as possible, while avoiding unnecessarily refetching unmodified resources.

He talks about several different approaches, including:

  • Monolith – the bigger the chunks the better; less overhead from loading more than one file per page
  • Splintered Approach – divide the code into multiple subfiles and load only what each page needs
  • Compression – gzipping the content to reduce the filesize sent to the browser (a sketch combining this with the monolith idea follows this list)
  • Caching – sending headers so the browser correctly caches the JavaScript file(s)
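
To make the monolith and compression options concrete, here’s a minimal sketch in PHP (the language the article’s examples use) that concatenates a few script files into one response and gzips it when the browser advertises support. The combine.php name and the file list are assumptions for illustration, not code from the article:

    <?php
    // Hypothetical combine.php - a minimal sketch (not Henderson's actual
    // code) of the monolith + compression ideas: concatenate several
    // script files into one response and gzip it when the browser allows.

    $files = array('prototype.js', 'app.js', 'widgets.js'); // assumed names

    $out = '';
    foreach ($files as $f) {
        $out .= file_get_contents('scripts/' . $f) . "\n";
    }

    header('Content-Type: application/x-javascript');

    // Only compress when the client says it accepts gzip.
    $accept = isset($_SERVER['HTTP_ACCEPT_ENCODING'])
        ? $_SERVER['HTTP_ACCEPT_ENCODING'] : '';

    if (strpos($accept, 'gzip') !== false) {
        header('Content-Encoding: gzip');
        echo gzencode($out);
    } else {
        echo $out;
    }
    ?>

Serving everything as one compressed file trades a larger first download for fewer HTTP requests – exactly the trade-off the article weighs against the splintered approach.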

For each there’s a brief description, the advantages and disadvantages of the method, and a code example (in PHP). He focuses largely on the caching option, however, and gives a longer example of how to ensure that your files are cached by the browser for as long as safely possible, reducing load times for JavaScript-heavy pages.
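
As a rough idea of what that looks like, here’s a hedged sketch of the far-future caching pattern in PHP; the js.php filename, the query parameters, and the one-year lifetime are assumptions for illustration, not the article’s actual code:

    <?php
    // Hypothetical js.php - a sketch of far-future caching, assuming a
    // versioned URL like /js.php?f=site&v=42 so the URL changes (and the
    // cache is busted) whenever the underlying file is updated.

    // basename() strips path components so ?f=../../etc/passwd can't escape.
    $file = 'scripts/' . basename($_GET['f']) . '.js';

    header('Content-Type: application/x-javascript');

    // Tell the browser to keep this response for a year.
    header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 31536000) . ' GMT');
    header('Cache-Control: max-age=31536000, public');

    readfile($file);
    ?>

The page would then reference something like /js.php?f=site&v=42, bumping the v parameter (deriving it from filemtime(), say) whenever the file changes, so the browser only refetches a script when its content actually differs.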

Posted by Chris Cornutt at 8:52 am

3 Comments »


What about JSMIN (http://www.crockford.com/javascript/jsmin.html), which can also help you shrink your JS files?

Comment by Quentin Dubois — June 2, 2006

It’s nice to see an article with content, thank you.

Comment by Dan — June 2, 2006
