Thursday, February 7th, 2008

JavaScript Library Loading Speed

Category: JavaScript, Performance

John Resig has analyzed JavaScript library loading speed by looking into the recent PbWiki testing results.

He delves into the fact that file size != speed and puts out the simple formula:

Total_Speed = Time_to_Download + Time_to_Evaluate
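To make the split concrete, here is a rough browser-side sketch that times the two halves separately. The script URL is just a placeholder (it has to live on your own domain), and the synchronous XHR plus eval() is purely for illustration, not how you would load a library for real:

    var url = "/js/library.js"; // placeholder: any library file on your own domain

    var t0 = new Date().getTime();
    var xhr = new XMLHttpRequest();
    xhr.open("GET", url, false); // synchronous, illustration only
    xhr.send(null);
    var t1 = new Date().getTime(); // Time_to_Download ~= t1 - t0

    eval(xhr.responseText); // run the library source
    var t2 = new Date().getTime(); // Time_to_Evaluate ~= t2 - t1

    alert("Total_Speed ~= " + (t2 - t0) + "ms (download " + (t1 - t0) +
          "ms, evaluate " + (t2 - t1) + "ms)");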

We also seem to obsess about packing and minification, where it often doesn't buy us that much since gzipping the data alone often does enough. That being said, if you have a lot of JavaScript it can certainly be worthwhile. It matters most that the frameworks themselves (which are normally bigger than the app) play nice.
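If you want to see what gzip alone buys you versus minification plus gzip, a quick sketch along these lines works (Node.js assumed, and the file names are hypothetical local copies of a library):

    var fs = require("fs");
    var zlib = require("zlib");

    var original = fs.readFileSync("library.js");     // hypothetical paths
    var minified = fs.readFileSync("library.min.js");

    console.log("original:       " + original.length + " bytes");
    console.log("original+gzip:  " + zlib.gzipSync(original).length + " bytes");
    console.log("minified:       " + minified.length + " bytes");
    console.log("minified+gzip:  " + zlib.gzipSync(minified).length + " bytes");

In most cases the gap between the two gzipped numbers is much smaller than the gap between the two raw numbers, which is the point above.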

In fact, walk around your site with Firebug/YSlow and see if you have set your headers up correctly. After watching Steve Souders at work, it boggles my mind how many big sites are misconfigured (let alone small ones).
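For the header side of things, the kind of setup YSlow rewards looks roughly like this for a static script: a long Cache-Control max-age and a far-future Expires (a Node.js sketch; the path and one-year lifetime are illustrative, not a recommendation for your particular site):

    var http = require("http");
    var fs = require("fs");

    http.createServer(function (req, res) {
      if (req.url === "/js/library.js") {   // hypothetical static resource
        var oneYear = 365 * 24 * 60 * 60;   // seconds
        res.writeHead(200, {
          "Content-Type": "application/javascript",
          "Cache-Control": "public, max-age=" + oneYear,
          "Expires": new Date(Date.now() + oneYear * 1000).toUTCString()
        });
        res.end(fs.readFileSync("library.js"));
      } else {
        res.writeHead(404);
        res.end();
      }
    }).listen(8080);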

Posted by Dion Almaer at 11:30 am

3 Comments »


Minification reduces Time_to_Evaluate too

Comment by andytesti — February 7, 2008

It sure is amusing how sites are delivered wrong even by those in the know, but do note that things are measurably improving. At Port80 Software you can now see the increase in compression clearly – http://www.port80software.com/surveys/top1000compression/ Caching is a bit more mixed: http://www.port80software.com/surveys/top1000cachecontrol/ suggests cache control is even more popular, but it turns out that the sites using cache control are generally busting caches. Add to this the Ajax-and-caching fun and there is even more of this going on.

Screen paint time is clearly important to measure, as John points out, but such thinking is not limited to JavaScript evaluation: Flash playback and even JPEG decompression are obvious and noticeable if you are too aggressive, regardless of delivery time. You can of course just play a bunch of media files or run lots of tasks at once and your page will run slow in the browser window… surprise, surprise, multitasking OSes at play. :-)

Comment by Thomas Powell — February 7, 2008

The Port80 stats are awesome! But I wanted to point out a couple of shortcomings in both the compression and the caching analysis performed.

The compression stats only look at the HTML document. It certainly is important to compress the HTML document, but it's perhaps even *more* important to compress scripts and stylesheets. I submitted this URL, "http://stevesouders.com/hpws/gzip-html.html", to the Compression Check form and it didn't notice that there was 120K of uncompressed JavaScript and CSS. It would be awesome if Port80 could extend their test framework to check these additional resources. As mentioned in the next paragraph, their framework is capable of detecting these additional resources in the page.

The cache control analysis reports the presence of cache control headers, but it does *not* indicate if those headers make the resources cacheable by the browser. I submitted this URL, "http://stevesouders.com/hpws/expiresoff.php", to their "Cache Check" form and it gave all the resources a green check indicating that cache expiration information was present, but unfortunately that information made the resources *not* cacheable. (This is an example of how *not* to do caching.) Checking that the expiration information is in the future would be a great enhancement to this test.
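Something along these lines is a sketch of the kind of check described above (Node.js assumed; the URL is the test page mentioned earlier): it reports whether the response came back compressed and whether its Expires is actually in the future.

    var http = require("http");

    http.get({ host: "stevesouders.com", path: "/hpws/expiresoff.php",
               headers: { "Accept-Encoding": "gzip" } }, function (res) {
      console.log("Content-Encoding: " + (res.headers["content-encoding"] || "none"));
      var expires = res.headers["expires"];
      if (!expires) {
        console.log("No Expires header at all");
      } else if (new Date(expires) <= new Date()) {
        console.log("Expires is present but already in the past: " + expires);
      } else {
        console.log("Expires looks cacheable until " + expires);
      }
      res.resume();
    });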

Comment by souders — February 8, 2008
