Thursday, August 21st, 2008

Page Test: Run AOL’s tool in the cloud, then sit back and wait

Category: Performance

Patrick Meenan has set up an IE7 instance in Virginia that we can poke to do an AOL Page Test.

You give it a URL and some options, such as the number of runs and whether to capture first and repeat views, and off it runs.

When it finishes you get to see the results, which give you high-level data on load times, waterfall graphs, an optimization checklist, and a screenshot of what the browser saw.

If the waterfall is hard to read, send it to Steve Souders. He reads them like Neo reads the Matrix :)

Posted by Dion Almaer at 10:48 am

6 Comments »


great tool, but I still like http detailer from IBM

Comment by V1 — August 21, 2008

fine tool, but didn’t we already have http://tools.pingdom.com/ for the same purpose and in the same way?

Comment by gautamkishore — August 21, 2008

Why is it saying etags are bad on static items?
That’s a new one to me.

Comment by ck2 — August 22, 2008

Ah, I found my own answer. Etags are only bad when you are on a multiple-server system that may serve the same file from any given server. Etags are typically system-based and may vary from server to server, so a browser that requests a file from one server but ends up talking to a different server later will fail the etag validation and re-request the file even though it has not changed.

Etags are not an issue for single-server setups though.

Comment by ck2 — August 22, 2008
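For anyone curious how the scenario ck2 describes plays out on the wire, here is a minimal sketch using Python’s standard library; the URL is just a placeholder, and the asset is assumed to send an ETag:

import urllib.request
import urllib.error

URL = "http://example.org/static/app.js"  # placeholder static asset

# First view: the server answers 200 and (usually) includes an ETag.
first = urllib.request.urlopen(URL)
etag = first.headers.get("ETag")
print("first view:", first.status, "ETag:", etag)

# Repeat view: the browser echoes the ETag back in If-None-Match.
# If the responding server computed the same ETag, it answers
# 304 Not Modified with no body; if a different server in the farm
# computed a different ETag for the same file, validation fails and
# the full file is re-sent even though it has not changed.
req = urllib.request.Request(URL, headers={"If-None-Match": etag or ""})
try:
    repeat = urllib.request.urlopen(req)
    print("repeat view:", repeat.status, "- full body re-sent")
except urllib.error.HTTPError as err:
    if err.code == 304:
        print("repeat view: 304 - cached copy is still valid")
    else:
        raise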

It’s basically a combination of pingdom, page detailer and YSlow (with a couple more features).

Pingdom is nice, and the UI in particular, with the live feedback, is great. The main problem is that it doesn’t use a real browser, so I wouldn’t recommend using it for investigating how to optimize your site. The load order/parallel connections are completely wrong and it doesn’t run javascript or flash (aol.com comes back with 46 requests instead of the 88 that really make up the page, and the waterfall looks completely wrong).

The desktop version on sourceforge is pretty close to what page detailer provides, though page detailer is more polished. We tend to use the hosted version more, though, since we can test at different connection speeds and don’t have to deal with clearing the browser’s cache and cookies. For more complicated apps that keep state we’ll revert to using the desktop version. I released it under a BSD license because I wanted people to be able to pull out parts of the code and use it for other things if they wanted to (of particular interest are the API and winsock hooking code).

The optimization checks it performs automate some of the easier things for a machine to look at and provide a basic starting point. Some nice unique features there are the savings estimates for gzip and image compression (cnn.com could save 366 KB by compressing their js; on an 868 KB page that’s pretty significant). The CDN detection is also pretty good and can generically detect all of the top providers (and if you’re using a CDN it doesn’t recognize, let me know and I can get the provider added). There’s no need to tell the tool about specific FQDNs that are on a CDN.

Thanks,

-Pat

Comment by PatrickMeenan — August 22, 2008
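To get a feel for the kind of gzip savings estimate Pat mentions, here is a rough sketch in Python; the filename is a placeholder for any uncompressed script you serve, and the numbers will of course differ from the tool’s own estimate:

import gzip
import os

path = "site.js"  # placeholder: an uncompressed JavaScript file

original = os.path.getsize(path)
with open(path, "rb") as f:
    compressed = len(gzip.compress(f.read()))

# Report how many bytes gzip would shave off this one asset.
saved = original - compressed
print(f"{path}: {original} -> {compressed} bytes "
      f"({saved} saved, {saved / original:.0%})")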

ck2: multiple servers are not always bad; it depends on your load balancer configuration and target group.

We have multiple servers, Etags configured, and no problems. This is because our load balancer assigns server locations to our users and makes them use the same server for a session period of 10 min. So it just depends on your server config, your target group (how long will they actually stay on your website?), etc. :)

Comment by V1 — August 23, 2008
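Here is a toy sketch of the affinity V1 describes; the backend names are made up, and a real load balancer would also expire the assignment after the session window rather than pin it forever. The idea is simply that repeat requests keep hitting the server that issued the original ETag:

import hashlib

BACKENDS = ["web1.internal", "web2.internal", "web3.internal"]  # made-up pool

def pick_backend(client_ip: str) -> str:
    # Hash the client address so the same visitor always maps to the
    # same backend; per-server etags then validate correctly because
    # repeat requests land on the server that generated them.
    digest = hashlib.md5(client_ip.encode()).hexdigest()
    return BACKENDS[int(digest, 16) % len(BACKENDS)]

print(pick_backend("203.0.113.7"))  # same backend every time for this client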
