Sunday, December 3rd, 2006

Yahoo! Performance Engineers discuss what the 80/20 Rule Tells Us about Reducing HTTP Requests

Category: Yahoo!

Tenni Theurer, a performance engineer at Yahoo!, has written a post on What the 80/20 Rule Tells Us about Reducing HTTP Requests.

It looks at the entirety of a page load, using yahoo.com as an example:

ASIDE: I have been using the Firebug beta to see exactly this kind of data on my own pages. It is invaluable; I couldn’t live without it.

Our experience shows that reducing the number of HTTP requests has the biggest impact on reducing response time and is often the easiest performance improvement to make.

There is a lot more to this, of course. Setting the correct cache-control headers for items that do not change often is important. The backend matters too: a few architecture changes and you can spend orders of magnitude more time in backend processing. We obviously want to start shoving HTML back down the pipe as quickly as possible.
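For example, a far-future Expires header combined with Cache-Control keeps repeat visitors from re-requesting static assets at all. A minimal sketch of what the response headers might look like (dates and values illustrative):

    HTTP/1.1 200 OK
    Content-Type: text/css
    Last-Modified: Sun, 03 Dec 2006 10:00:00 GMT
    Expires: Mon, 03 Dec 2007 10:00:00 GMT
    Cache-Control: max-age=31536000

If a file cached this aggressively ever needs to change, the usual trick is to change its URL (say, a version number in the filename) so clients fetch the new copy.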

It also shows how much of a difference parallel downloads can make. If you can tweak your browser to fetch more resources at once, pages can load noticeably faster, since chances are your bandwidth isn’t the bottleneck.
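In Firefox, for instance, the relevant knobs live in about:config. A sketch of the preferences involved (names from the Firefox 1.5/2.0 era; defaults can vary by build, so treat the values as illustrative):

    network.http.max-connections-per-server: 8
    network.http.max-persistent-connections-per-server: 6  (default is 2, per the HTTP/1.1 spec)
    network.http.pipelining: true  (off by default)

Raising the per-server connection count mostly helps pages with many small assets; it does nothing for one large download.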


Posted by Dion Almaer at 10:17 am

8 Comments »


Firebug beta! Ha! Hopefully we can use it soon.

Comment by March — December 3, 2006

Hi there,

Wasn’t there a related recent study by a Google engineer? He concluded that upload bandwidth (not download bandwidth) makes a difference for AJAX sites.

It’s interesting to know more about these analyses. Good to know that Firebug has some tools for us developers.

BR,
~A

Comment by anjan bacchu — December 3, 2006

Firebug 1.0 – much anticipated. I do hope Joe Hewitt releases something soon.

0.41 is just breaking my heart :(

Comment by Richard — December 3, 2006

Dion, good point on parallel downloads, but relying on users to tweak their browser settings really isn’t tenable. The best approach for parallelizing connections is to split content out between different hosts. Most browsers are configured to download 2 objects per host, so images.mysite.com and css.mysite.com will be allocated 2 connections each. Note that the total cap on parallel connections is 6.
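For illustration, a minimal sketch of that host split in markup (filenames hypothetical; both hostnames can alias the same server):

    <!-- Two host aliases: the browser opens 2 connections to each,
         so these assets download four at a time instead of two -->
    <link rel="stylesheet" type="text/css" href="http://css.mysite.com/site.css" />
    <img src="http://images.mysite.com/header.png" alt="Header" />
    <img src="http://images.mysite.com/logo.png" alt="Logo" />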

Comment by Ryan Breen — December 3, 2006

See also Google’s little-known Load Time Analyzer Firefox extension. If you don’t want to pay for Firebug 1.0, then check out LTA.

Comment by RichB — December 3, 2006

Lucky!

Tamper Data for Firefox also has a similar load graphing function; it’s quite nice. (Still want Firebug 1.0, though.)

Comment by Mark Kawakami — December 3, 2006

[...] It was only slightly ironic that various caching/compression mechanisms were highlighted again this weekend on ajaxian.com, as I was already well underway with some “improvements” of my own in this area with Tapestry. [...]

Pingback by opencomponentry » Blog Archive » gzip, where have you been all my life..? — December 3, 2006

[...] Ajaxian » Yahoo! Performance Engineers discuss what the 80/20 Rule Tells Us about Reducing HTTP Requests (webdev code performance HTML web) IEBlog : IE6 and IE7 Running on a Single Machine (IE7 IE6 Development virtualpc ) [...]

Pingback by CyanCode » Blog Archive » Code Links (12.06.2006) — December 6, 2006
